Monitor model performance improvements with Test Runs
Test Runs validate model configurations against your Test Sets so you can track improvements at the Intent level. Maitai automatically executes a Test Run whenever a new model is fine-tuned, and you can also create Test Runs manually to evaluate different models or configuration changes.
The Test Set Overview page displays the Pass Rate for each Test Run: the percentage of requests that scored "satisfactory" or higher (3 out of 5 or better).
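To make the Pass Rate definition concrete, here is a minimal sketch of how such a percentage could be computed from per-request scores. The 1-5 scale and the 3/5 "satisfactory" threshold come from the description above; the function name and data layout are illustrative, not part of the Maitai API.

```python
# Illustrative only: computing a Pass Rate like the one shown on the
# Test Set Overview page. Scores of 3/5 or better count as passing.

SATISFACTORY_THRESHOLD = 3

def pass_rate(scores: list[int]) -> float:
    """Percentage of requests scoring at or above the threshold."""
    if not scores:
        return 0.0
    passed = sum(1 for s in scores if s >= SATISFACTORY_THRESHOLD)
    return 100.0 * passed / len(scores)

# Example: 4 of 5 requests score 3 or higher -> 80.0% Pass Rate
print(pass_rate([5, 3, 2, 4, 3]))  # -> 80.0
```

Comparing this number across Test Runs for the same Test Set is what lets you see whether a fine-tune or configuration change actually improved results.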