diff --git a/docs/inspector/test-cases.mdx b/docs/inspector/test-cases.mdx
index ea5061ce8..cce2b2732 100644
--- a/docs/inspector/test-cases.mdx
+++ b/docs/inspector/test-cases.mdx
@@ -83,7 +83,7 @@ You can run tests in two ways:
 
 ## Analyzing Results
 
-### Results & Runs View
+### Suite Overview
 
 Click Results & Runs in the sidebar to see overall analytics:
 
@@ -91,14 +91,21 @@ Click Results & Runs in the sidebar to see overall analytics:
 - **Accuracy Chart** - Shows pass rates across runs (line connects multiple runs)
 - **Performance by Model** - Bar chart comparing models
 
-Use the dropdown to switch between "Runs" and "Test Cases" views:
-
-**Runs view:**
-- **Run History** - Shows all your runs with their metrics (Run ID, Start time, Duration, Passed, Failed, Accuracy, Tokens)
+The suite overview provides three views to analyze your test results from different angles.
+
+Use the tabs to switch between "Test cases", "Executions", and "Runs" views:
 
 **Test Cases view:**
 - **Test Cases Table** - List of all tests with Test Case Name, Iterations, Avg Accuracy, Avg Duration
 
+**Executions view:**
+- **Execution Timeline** - Flat list of every test execution across all test cases, sorted by most recent
+- Shows test case name, result (passed/failed/pending/cancelled), and timestamp
+- Click any execution to view its details in the compare view
+
+**Runs view:**
+- **Run History** - Shows all your runs with their metrics (Run ID, Start time, Duration, Passed, Failed, Accuracy, Tokens)
+
 ### Run Detail View
 
 When you click on a run: