COR Brief

Optuna

Optuna is an open-source Python library for automatic hyperparameter optimization of machine learning models. Introduced in 2018 by Preferred Networks, it supports dynamic construction of search spaces during code execution through its define-by-run API. The framework efficiently searches large hyperparameter spaces, discards unpromising trials early through pruning, and supports parallelization across multiple threads or processes. Optuna integrates with popular machine learning libraries such as PyTorch, TensorFlow, XGBoost, LightGBM, Keras, and CatBoost, and provides a dashboard for real-time monitoring of optimization progress and hyperparameter importance.

Updated Jan 7, 2026

Open-source Python library for automatic hyperparameter optimization with dynamic search space definition and parallelization support.

Pricing: open-source
Category: Code & Development
Company: Preferred Networks
1. Allows dynamic, Pythonic construction of hyperparameter search spaces using conditionals and loops during code execution.
2. Includes algorithms such as TPE, CMA-ES, GP-based Bayesian optimization, NSGA-II for multi-objective optimization, and pruning of unpromising trials.
3. Supports parallel execution of optimization trials across tens or hundreds of workers with minimal code changes.
4. Compatible with machine learning libraries including PyTorch, TensorFlow, XGBoost, LightGBM, Keras, CatBoost, MLflow, and Weights & Biases.
5. Provides real-time visualization of optimization history and hyperparameter importance through graphs and tables.

Hyperparameter Tuning for ML Models

Machine learning practitioners can use Optuna to automatically optimize hyperparameters of models built with frameworks like PyTorch or TensorFlow.

Multi-objective Optimization

Users can optimize several competing objectives at once using algorithms such as NSGA-II, which Optuna supports out of the box.

1. Install Optuna: run pip install optuna (requires Python 3.9 or later).
2. Define an objective function: write a function that takes a trial object and returns the score to optimize.
3. Create a study: initialize one with study = optuna.create_study().
4. Run the optimization: call study.optimize(objective, n_trials=100) to start the hyperparameter search.
5. Visualize the results: use the built-in plotting functions or launch the dashboard with optuna-dashboard to monitor progress.
Pricing
Model: open-source

Optuna is free to use and open-source, installable via pip with no paid plans.

Assessment
Strengths
  • Supports dynamic search space definition at runtime without requiring predefined grids.
  • Integrates with multiple popular machine learning libraries including PyTorch, TensorFlow, and XGBoost.
  • Enables easy parallelization to scale optimization across multiple workers.
  • Provides visualization tools and a real-time dashboard for monitoring optimization progress.
  • Supports multi-objective and constrained optimization.
Limitations
  • Primarily focused on Python, limiting usability in other programming languages.
  • Requires manual definition of objective functions, which adds code overhead.