Summary
Since pet can be run as a JSONRPC server (via `pet server`), we should have dedicated end-to-end performance tests that exercise the server from a client perspective and measure discovery latency across platforms.
Current State
What exists:
- CI integration tests in `.github/workflows/pr-check.yml` across Windows, macOS (x86_64/aarch64), and Linux
- Discovery validation tests in `crates/pet/tests/ci_test.rs` that verify discovered environments are correct
- Sample JSONRPC client in `docs/sample.js` demonstrating client connection
- Telemetry infrastructure (the `RefreshPerformance` struct) that tracks the timing breakdown by locator
What's missing:
The existing tests validate correctness but don't systematically test performance from a client perspective.
Proposed E2E Performance Tests
Test Categories
- **Server Cold Start Performance**
  - Measure time from process spawn to first successful JSONRPC response (see the harness sketch after this list)
  - Test with various `configure` payloads (empty workspace, large workspace, many environment directories)
- **Discovery Latency Tests**
  - Full machine scan (`refresh` with no filters)
  - Workspace-scoped discovery (`refresh` with `searchPaths`)
  - Kind-specific discovery (`refresh` with `searchKind`)
  - Measure and assert on P50/P95/P99 latencies (see the percentile helper after this list)
- **Resolve Performance**
  - Cold resolve (no cache)
  - Warm resolve (with `cacheDirectory` populated)
  - Batch resolution patterns
- **Concurrent Request Handling**
  - Multiple `refresh` requests (should serialize per the existing `REFRESH_LOCK`)
  - Multiple `resolve` requests in parallel
  - Mixed workload patterns
- **Platform-Specific Baselines**
  - Windows (x86_64, aarch64)
  - macOS (x86_64, aarch64)
  - Linux (x86_64, musl, gnu)
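To make the first two categories concrete, the sketch below shows one possible Rust harness (the "Rust integration tests" option from the table below) that spawns `pet server`, then times cold start and a full `refresh`. This is illustrative only: it assumes `pet` is on `PATH`, LSP-style `Content-Length` framing over stdio, and the request shapes described in this issue, all of which should be verified against `docs/JSONRPC.md` and `docs/sample.js`.

```rust
// Sketch of an E2E timing harness; NOT existing test code.
// Assumptions to verify against docs/JSONRPC.md and docs/sample.js:
//   - `pet server` speaks JSONRPC over stdio with LSP-style Content-Length framing
//   - `configure` and `refresh` accept (possibly empty) object params
use std::io::{BufRead, BufReader, Read, Write};
use std::process::{Command, Stdio};
use std::time::Instant;

// Write one framed JSONRPC message to the server's stdin.
fn send(stdin: &mut impl Write, body: &str) {
    write!(stdin, "Content-Length: {}\r\n\r\n{}", body.len(), body).unwrap();
    stdin.flush().unwrap();
}

// Read one framed message: parse headers up to the blank line, then the body.
fn read_message(reader: &mut BufReader<impl Read>) -> String {
    let mut len = 0usize;
    loop {
        let mut line = String::new();
        reader.read_line(&mut line).unwrap();
        let line = line.trim_end();
        if line.is_empty() {
            break;
        }
        if let Some(value) = line.strip_prefix("Content-Length:") {
            len = value.trim().parse().unwrap();
        }
    }
    let mut body = vec![0u8; len];
    reader.read_exact(&mut body).unwrap();
    String::from_utf8(body).unwrap()
}

fn main() {
    let spawn_start = Instant::now();
    let mut child = Command::new("pet") // assumes `pet` is on PATH
        .arg("server")
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()
        .expect("failed to spawn pet server");
    let mut stdin = child.stdin.take().unwrap();
    let mut reader = BufReader::new(child.stdout.take().unwrap());

    // Cold start: spawn -> first successful JSONRPC response.
    send(&mut stdin, r#"{"jsonrpc":"2.0","id":1,"method":"configure","params":{}}"#);
    let _ = read_message(&mut reader);
    println!("cold start: {:?}", spawn_start.elapsed());

    // Discovery latency: full machine scan, no filters.
    let refresh_start = Instant::now();
    send(&mut stdin, r#"{"jsonrpc":"2.0","id":2,"method":"refresh","params":{}}"#);
    loop {
        // Everything before the id:2 response is a notification (e.g. `environment`).
        if read_message(&mut reader).contains("\"id\":2") {
            break;
        }
    }
    println!("full refresh: {:?}", refresh_start.elapsed());
    child.kill().ok();
}
```

A real harness would deserialize messages with `serde_json` instead of substring-matching on `"id":2`, and would repeat the `refresh` across fresh server processes to collect enough samples for percentile assertions.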
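For the P50/P95/P99 assertions, a dependency-free nearest-rank percentile helper is enough to get started; the threshold below is a placeholder until per-platform baselines from the matrix above exist.

```rust
// Nearest-rank percentile over sorted samples (milliseconds).
// A real harness might prefer a stats crate; thresholds here are placeholders.
fn percentile(sorted_ms: &[u128], p: f64) -> u128 {
    assert!(!sorted_ms.is_empty());
    let rank = ((p / 100.0) * sorted_ms.len() as f64).ceil() as usize;
    sorted_ms[rank.max(1) - 1]
}

fn assert_latencies(mut samples_ms: Vec<u128>) {
    samples_ms.sort_unstable();
    let p50 = percentile(&samples_ms, 50.0);
    let p95 = percentile(&samples_ms, 95.0);
    let p99 = percentile(&samples_ms, 99.0);
    println!("P50={p50}ms P95={p95}ms P99={p99}ms");
    // Hypothetical baseline: fail if P95 exceeds 5s on this platform.
    assert!(p95 < 5_000, "P95 latency regression: {p95}ms");
}
```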
Implementation Options
| Approach | Pros | Cons |
|---|---|---|
| Rust integration tests | Native, can share code with existing tests | Harder to simulate realistic JSONRPC client |
| Node.js test harness | Matches VS Code Python extension usage pattern | Separate test infrastructure |
| Criterion benchmarks | Rust-native, regression detection built-in | Doesn't test full server spawn path |
| Hyperfine CLI benchmarks | Simple, cross-platform | Limited to CLI, not JSONRPC server |
Suggested Metrics
- Discovery duration (returned in `RefreshResult.duration`)
- Time-to-first-environment (elapsed time until the first `environment` notification; see the sketch after this list)
- Total environments discovered per unit time
- Server startup latency
- Memory usage during discovery (if feasible)
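Time-to-first-environment falls out of the same message loop as the harness above: record the elapsed time when the first notification with `"method":"environment"` arrives. The sketch below reuses the `read_message` helper from that harness; the notification name comes from this issue and should be confirmed against `docs/JSONRPC.md`.

```rust
use std::io::{BufReader, Read};
use std::time::{Duration, Instant};

// Returns the elapsed time at the first `environment` notification, scanning
// until the `refresh` response (id 2 in the harness above) ends the stream.
// Relies on `read_message` from the harness sketch.
fn time_to_first_environment(
    reader: &mut BufReader<impl Read>,
    refresh_start: Instant,
) -> Option<Duration> {
    let mut first = None;
    loop {
        let msg = read_message(reader);
        if first.is_none() && msg.contains("\"method\":\"environment\"") {
            first = Some(refresh_start.elapsed());
        }
        if msg.contains("\"id\":2") {
            return first;
        }
    }
}
```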
CI Integration
- Add a performance test job to `pr-check.yml` or a separate workflow (see the sketch after this list)
- Store results as artifacts for comparison
- Consider a GitHub Actions benchmark action for regression detection
- Run on representative matrices (at minimum: Windows, Ubuntu, macOS)
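A separate workflow could look roughly like the sketch below. The job name, the `--test perf` target, and the `--ignored` opt-in convention are all hypothetical placeholders, not existing repository configuration.

```yaml
# Hypothetical performance workflow; all names are placeholders.
name: performance
on: [pull_request]
jobs:
  perf:
    strategy:
      matrix:
        os: [windows-latest, ubuntu-latest, macos-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - run: cargo build --release
      # Perf tests are assumed to be #[ignore]d so they only run in this job.
      - run: cargo test --release --test perf -- --ignored --nocapture | tee perf.txt
      - uses: actions/upload-artifact@v4
        with:
          name: perf-results-${{ matrix.os }}
          path: perf.txt
```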
Acceptance Criteria
- E2E tests that spawn `pet server` and communicate via JSONRPC
- Performance baselines established for each platform
- A CI job that runs performance tests and detects regressions
- Documentation for running performance tests locally
References
- JSONRPC API spec: `docs/JSONRPC.md`
- Sample client: `docs/sample.js`
- Existing tests: `crates/pet/tests/ci_test.rs`
- Performance telemetry: `crates/pet-core/src/telemetry/refresh_performance.rs`