SigOpt
Optimize ML model hyperparameters, discover best configurations, track model performance.

Features

Bayesian optimisation
Uses statistical methods to intelligently suggest hyperparameter combinations, reducing the number of trials needed compared to grid or random search
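The core loop can be sketched as follows: a toy Gaussian-process surrogate with an upper-confidence-bound acquisition rule on a one-dimensional synthetic objective. This illustrates the general technique only, not SigOpt's proprietary optimiser; the objective, kernel length scale, and all numbers are made up.

```python
import math

def objective(x):
    # Synthetic "validation score" peaking at x = 0.33; stands in for a
    # real training run over one hyperparameter.
    return -(x - 0.33) ** 2

def kernel(a, b, length=0.15):
    # Squared-exponential kernel: nearby hyperparameter values are
    # assumed to give similar scores.
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

def solve(A, b):
    # Gaussian elimination with partial pivoting (small dense systems).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(xs, ys, x_star, noise=1e-6):
    # Gaussian-process posterior mean and variance at a candidate point.
    K = [[kernel(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    k_star = [kernel(x_star, a) for a in xs]
    alpha = solve(K, ys)
    v = solve(K, k_star)
    mean = sum(k * w for k, w in zip(k_star, alpha))
    var = max(kernel(x_star, x_star) - sum(k * w for k, w in zip(k_star, v)), 0.0)
    return mean, var

def bayes_opt(n_iter=10, beta=2.0):
    grid = [i / 50 for i in range(51)]   # candidate hyperparameter values
    xs = [0.1, 0.5, 0.9]                 # small initial design
    ys = [objective(x) for x in xs]
    for _ in range(n_iter):
        def ucb(x):
            # Upper confidence bound: reward high predicted score and
            # high uncertainty, trading exploitation against exploration.
            m, v = gp_posterior(xs, ys, x)
            return m + beta * math.sqrt(v)
        nxt = max((x for x in grid if x not in xs), key=ucb)
        xs.append(nxt)
        ys.append(objective(nxt))
    best = max(range(len(xs)), key=lambda i: ys[i])
    return xs[best], ys[best]

best_x, best_y = bayes_opt()
```

After a handful of evaluations the loop concentrates trials near the optimum instead of covering the grid exhaustively, which is where the savings over grid or random search come from.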
Multi-metric tracking
Monitor multiple performance metrics simultaneously, not just accuracy or loss, to balance competing objectives
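One common way to balance competing objectives across runs is to keep only the non-dominated (Pareto-optimal) configurations; a minimal sketch, with made-up run names and metric names (`accuracy`, `latency_ms`):

```python
def dominates(a, b, maximise=("accuracy",), minimise=("latency_ms",)):
    # True if run `a` is at least as good as `b` on every metric and
    # strictly better on at least one.
    at_least = (all(a[m] >= b[m] for m in maximise)
                and all(a[m] <= b[m] for m in minimise))
    strictly = (any(a[m] > b[m] for m in maximise)
                or any(a[m] < b[m] for m in minimise))
    return at_least and strictly

def pareto_front(runs):
    # Keep every run that no other run dominates.
    return [r for r in runs if not any(dominates(o, r) for o in runs)]

runs = [
    {"name": "run-a", "accuracy": 0.91, "latency_ms": 120},
    {"name": "run-b", "accuracy": 0.89, "latency_ms": 35},
    {"name": "run-c", "accuracy": 0.88, "latency_ms": 60},   # dominated by run-b
    {"name": "run-d", "accuracy": 0.93, "latency_ms": 300},
]
front = pareto_front(runs)
```

Each run on the front represents a different trade-off (fastest, most accurate, or somewhere in between) rather than a single "best" number.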
Experiment history and dashboards
View all past optimisation runs in one place, compare configurations, and track performance trends over time
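A minimal in-memory stand-in for that history, for illustration only: record each run's configuration and metric, then query the best run or the best-so-far trend that a dashboard would plot.

```python
class ExperimentLog:
    def __init__(self):
        self.runs = []

    def record(self, config, metric):
        # One entry per optimisation run: configuration plus final metric.
        self.runs.append({"config": config, "metric": metric})

    def best(self):
        return max(self.runs, key=lambda r: r["metric"])

    def best_so_far(self):
        # Running maximum over runs, i.e. the optimisation progress curve.
        trend, cur = [], float("-inf")
        for r in self.runs:
            cur = max(cur, r["metric"])
            trend.append(cur)
        return trend

log = ExperimentLog()
log.record({"lr": 0.10}, 0.80)
log.record({"lr": 0.01}, 0.88)
log.record({"lr": 0.03}, 0.85)
```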
API integration
Works with Python, Java, and other languages through a REST API, fitting into existing training pipelines
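A sketch of how a training loop might talk to a hyperparameter service over REST. The host, endpoint paths, and JSON field names below are illustrative placeholders, not SigOpt's documented schema; consult the vendor's API reference for the real endpoints.

```python
import json

BASE = "https://api.example.com/v1"   # placeholder host, not a real endpoint

def create_experiment_request(name, parameters):
    # Build (but do not send) the request that registers an experiment.
    return {
        "method": "POST",
        "url": f"{BASE}/experiments",
        "body": json.dumps({"name": name, "parameters": parameters}),
    }

def report_observation_request(experiment_id, suggestion_id, value):
    # Build the request that reports one trial's result back to the service.
    return {
        "method": "POST",
        "url": f"{BASE}/experiments/{experiment_id}/observations",
        "body": json.dumps({"suggestion": suggestion_id, "value": value}),
    }

req = create_experiment_request(
    "lr-sweep",
    [{"name": "learning_rate", "type": "double",
      "bounds": {"min": 1e-4, "max": 1e-1}}],
)
```

In a real pipeline the loop is: request a suggestion, train with the suggested values, report the observed metric, repeat; the service picks each next suggestion from the accumulated observations.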
Parameter constraints and conditions
Set rules for which hyperparameter combinations are valid, and establish dependencies between settings
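A sketch of what such rules look like in practice: one constraint spanning several parameters and one conditional parameter that only applies when another setting is enabled. The specific rules here are made up for illustration.

```python
def is_valid(config):
    # Constraint: cap the total width budget across layers, so the
    # optimiser never proposes a model that cannot fit in memory.
    if config["num_layers"] * config["units_per_layer"] > 4096:
        return False
    # Condition: dropout_rate is only meaningful when use_dropout is on.
    if not config["use_dropout"] and config.get("dropout_rate", 0.0) != 0.0:
        return False
    return True
```

Filtering suggestions through checks like this keeps the search inside the region of configurations that can actually be trained.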
Model insights
Analyse which hyperparameters have the most influence on your model's performance
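One simple heuristic for this kind of analysis is the absolute Pearson correlation between each hyperparameter's value and the final metric across past trials. This is a rough sketch of the idea, not SigOpt's actual method, and the trial data below is synthetic.

```python
import math

def pearson(xs, ys):
    # Pearson correlation coefficient for two equal-length sequences.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def importance(trials, metric="score"):
    # |correlation| per hyperparameter: higher means more influence.
    params = [k for k in trials[0] if k != metric]
    return {p: abs(pearson([t[p] for t in trials],
                           [t[metric] for t in trials]))
            for p in params}

trials = [
    {"learning_rate": 0.10, "batch_size": 32,  "score": 0.70},
    {"learning_rate": 0.01, "batch_size": 64,  "score": 0.90},
    {"learning_rate": 0.05, "batch_size": 32,  "score": 0.80},
    {"learning_rate": 0.02, "batch_size": 128, "score": 0.88},
]
ranked = importance(trials)
```

In this synthetic data the score moves almost entirely with the learning rate, so it ranks far above batch size; correlation only captures linear effects, which is why production tools use richer analyses.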
Use cases

Optimising deep learning models where training is expensive and hyperparameter choices significantly affect accuracy
Finding the best learning rate, batch size, and regularisation settings for neural networks
Tuning gradient boosting models by optimising tree depth, learning rate, and other key parameters
Comparing multiple model architectures and configurations in a structured, tracked way
Managing hyperparameter optimisation across a team, allowing collaboration and reproducibility
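The workflows above share a pattern: a reproducible search over a declared space, with every trial recorded so teammates can replay or compare runs. Here is a minimal seeded random-search version of that pattern over learning rate and batch size; the objective is a synthetic stand-in for a real training run.

```python
import math
import random

def validation_score(lr, batch_size):
    # Synthetic stand-in for a training run: peaks near lr = 0.01, with a
    # mild penalty for batch sizes far from 64. Purely illustrative.
    return math.exp(-((math.log10(lr) + 2) ** 2)) - 0.0005 * abs(batch_size - 64)

def random_search(n_trials=20, seed=42):
    rng = random.Random(seed)            # fixed seed => replayable history
    history = []
    for _ in range(n_trials):
        lr = 10 ** rng.uniform(-4, -1)   # log-uniform learning rate
        batch = rng.choice([16, 32, 64, 128])
        history.append({"lr": lr, "batch_size": batch,
                        "score": validation_score(lr, batch)})
    return history

history = random_search()
best = max(history, key=lambda t: t["score"])
```

Because the seed and the full history are kept, anyone can rerun the search, get identical trials, and audit why a configuration was chosen; swapping the random sampler for a service-driven suggestion loop leaves the rest of the pattern unchanged.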