Phoenix
Open-source tool for ML observability that runs in your notebook environment, by Arize. Monitor and fine-tune LLM, CV, and tabular models.

LLM Tracing and Instrumentation
Capture detailed traces of LLM calls and interactions to understand model behavior and identify bottlenecks
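To make the idea concrete, here is a minimal, generic sketch of what trace instrumentation records per call (name, latency, inputs, output). This is an illustration of the concept only, not Phoenix's actual internals or API; the decorator and the `fake_llm_call` stand-in are invented for the example.

```python
import functools
import time

# Collected spans: each records the call name, latency, inputs, and output.
# Generic illustration of trace capture, not Phoenix's implementation.
SPANS = []

def traced(fn):
    """Append a span describing each call to SPANS."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        SPANS.append({
            "name": fn.__name__,
            "latency_s": time.perf_counter() - start,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": result,
        })
        return result
    return wrapper

@traced
def fake_llm_call(prompt: str) -> str:
    # Stand-in for a real LLM request.
    return f"response to: {prompt}"

fake_llm_call("What is observability?")
print(SPANS[0]["name"])    # fake_llm_call
print(SPANS[0]["output"])  # response to: What is observability?
```

In a real tracing setup, spans like these would carry token counts, model parameters, and parent/child relationships so slow or failing steps can be pinpointed.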
Notebook-based Evaluation
Run evaluations directly in Jupyter notebooks for smooth integration into existing ML workflows
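The kind of evaluation you might run in a notebook cell can be sketched as scoring a small dataset of model answers against expected answers and aggregating. The dataset and the case-insensitive exact-match rule below are invented for illustration; production evals often use LLM judges or task-specific metrics instead.

```python
# Hypothetical eval rows: (question, expected answer, model answer).
rows = [
    {"question": "2+2?", "expected": "4", "answer": "4"},
    {"question": "Capital of France?", "expected": "Paris", "answer": "paris"},
    {"question": "Largest planet?", "expected": "Jupiter", "answer": "Saturn"},
]

def exact_match(expected: str, answer: str) -> bool:
    # Case-insensitive exact match; a deliberately simple scoring rule.
    return expected.strip().lower() == answer.strip().lower()

scores = [exact_match(r["expected"], r["answer"]) for r in rows]
accuracy = sum(scores) / len(scores)
print(f"accuracy: {accuracy:.2f}")  # accuracy: 0.67
```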
Multi-model Support
Monitor and optimize LLMs, computer vision models, and tabular machine learning models from a single platform
Real-time Monitoring
Track model performance and behavior in real time with interactive dashboards and visualizations
Framework Agnostic
Works with any ML framework or LLM provider without requiring vendor-specific implementations
Experiment Tracking
Compare model versions and experiments to identify the best performing configurations
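Comparing experiments ultimately reduces to ranking runs by an evaluation metric. A minimal sketch, with invented experiment names and scores:

```python
# Hypothetical experiment records; names and numbers are illustrative only.
experiments = [
    {"name": "gpt-4o / prompt-v1", "accuracy": 0.81, "latency_ms": 950},
    {"name": "gpt-4o / prompt-v2", "accuracy": 0.87, "latency_ms": 1010},
    {"name": "gpt-4o-mini / prompt-v2", "accuracy": 0.79, "latency_ms": 420},
]

# Pick the run with the highest accuracy; a real comparison might also
# weigh latency and cost.
best = max(experiments, key=lambda e: e["accuracy"])
print(best["name"])  # gpt-4o / prompt-v2
```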
Debugging and tracing LLM application issues during development and testing phases
Evaluating and comparing different LLM models or prompts before production deployment
Monitoring model performance and detecting data drift in real time for production systems
Optimizing computer vision and tabular model performance through detailed performance analysis
Educational use for learning about ML observability and model behavior analysis
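The drift-detection use case above can be sketched with one common statistic, the Population Stability Index (PSI), which compares a feature's binned distribution in production against its training-time baseline. The bin proportions below are invented for illustration, and PSI is just one of several drift measures such tools may offer.

```python
import math

def psi(baseline: list[float], current: list[float]) -> float:
    """Population Stability Index over matched histogram bins.

    Inputs are per-bin proportions (each list sums to 1).
    A common rule of thumb: PSI > 0.2 suggests significant drift.
    """
    return sum(
        (c - b) * math.log(c / b)
        for b, c in zip(baseline, current)
        if b > 0 and c > 0
    )

baseline = [0.25, 0.25, 0.25, 0.25]  # training-time feature distribution
drifted  = [0.10, 0.20, 0.30, 0.40]  # hypothetical production distribution

print(f"{psi(baseline, drifted):.3f}")  # 0.228
```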