3 changes: 0 additions & 3 deletions data/dailySnapshots/2026-03-21.json

This file was deleted.

7 changes: 0 additions & 7 deletions data/dailySnapshots/2026-03-27.json

This file was deleted.

5 changes: 0 additions & 5 deletions data/dailySnapshots/2026-04-17.json

This file was deleted.

11 changes: 0 additions & 11 deletions data/dailySnapshots/2026-04-19.json

This file was deleted.

68 changes: 0 additions & 68 deletions data/modelWeights.json

This file was deleted.

106 changes: 106 additions & 0 deletions tests/dailyRefinement.system.test.ts
@@ -0,0 +1,106 @@
import * as sqlite3 from "sqlite3";
import * as path from "path";
import * as fs from "fs";

// Point DB_PATH at a test database before importing the job
const TEST_DB_PATH = path.join(__dirname, "system_test_issues.db");
const TEST_WEIGHTS_PATH = path.join(__dirname, "../data/modelWeights.json");
const TEST_SNAPSHOTS_DIR = path.join(__dirname, "../data/dailySnapshots");

process.env.DB_PATH = TEST_DB_PATH;

import { DailyRefinementJob } from "../scheduler/dailyRefinementJob";
Comment on lines +6 to +12

⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Use a unique DB file for this test run.

A fixed tests/system_test_issues.db makes reruns and parallel workers trample the same SQLite file, which can leave stale tables behind and make the setup fail intermittently. Please move this to a per-run temp file and set DB_PATH inside the setup hook.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/dailyRefinement.system.test.ts` around lines 6 - 12, Replace the fixed
TEST_DB_PATH with a per-run temporary file and set process.env.DB_PATH inside
the test setup hook so each run gets its own SQLite file; specifically, change
the constant TEST_DB_PATH creation to use a temp filename (e.g., via fs.mkdtemp
or os.tmpdir with a randomized suffix) and move the assignment
process.env.DB_PATH = TEST_DB_PATH into a beforeEach or beforeAll setup block
that runs prior to constructing/using DailyRefinementJob, ensuring teardown
removes the temp file after tests complete to avoid leftover tables.
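The fix this comment asks for could be sketched as follows (a minimal illustration, not code from the PR; `makeTempDbPath` is a hypothetical helper name):

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Hypothetical helper: build a per-run SQLite path inside a fresh temp
// directory. mkdtempSync appends random characters to the prefix, so
// parallel Jest workers and repeated runs never share a file.
function makeTempDbPath(prefix = "daily-refinement-"): string {
  const dir = fs.mkdtempSync(path.join(os.tmpdir(), prefix));
  return path.join(dir, "issues.db");
}

// In the suite, DB_PATH would then be assigned inside the setup hook, e.g.:
//   beforeAll(() => { process.env.DB_PATH = makeTempDbPath(); });
const first = makeTempDbPath();
const second = makeTempDbPath();
console.log(first !== second); // true — distinct path on every call
```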


describe("DailyRefinementJob System Tests", () => {
let db: sqlite3.Database;
let job: DailyRefinementJob;

beforeAll((done) => {
// Ensure test directories exist
if (!fs.existsSync(path.join(__dirname, "../data"))) {
fs.mkdirSync(path.join(__dirname, "../data"), { recursive: true });
}
if (!fs.existsSync(TEST_SNAPSHOTS_DIR)) {
fs.mkdirSync(TEST_SNAPSHOTS_DIR, { recursive: true });
}

db = new sqlite3.Database(TEST_DB_PATH, (err) => {
if (err) return done(err);

db.run(
`CREATE TABLE issues (
id INTEGER PRIMARY KEY,
description TEXT,
category TEXT,
status TEXT,
created_at DATETIME,
upvotes INTEGER,
latitude REAL,
longitude REAL
)`,
(err) => {
if (err) return done(err);

// Insert dummy data
db.run(
`INSERT INTO issues (description, category, status, created_at, latitude, longitude) VALUES
('huge pothole', 'Pothole', 'resolved', datetime('now', '-2 hours'), 19.0, 72.8),
('pothole again', 'Pothole', 'resolved', datetime('now', '-3 hours'), 19.0, 72.8),
('garbage everywhere', 'Garbage', 'open', datetime('now', '-4 hours'), 19.1, 72.9),
('pothole fixed', 'Pothole', 'resolved', datetime('now', '-5 hours'), 19.0, 72.8),
('pothole very bad', 'Pothole', 'resolved', datetime('now', '-1 hours'), 19.0, 72.8),
('pothole dangerous', 'Pothole', 'resolved', datetime('now', '-6 hours'), 19.0, 72.8),
('pothole resolved', 'Pothole', 'resolved', datetime('now', '-10 hours'), 19.0, 72.8)
`,
(err) => {
if (err) return done(err);
job = new DailyRefinementJob();
Contributor

P2: Close the DailyRefinementJob SQLite connection as part of cleanup; this suite leaves an extra sqlite3 handle open.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At tests/dailyRefinement.system.test.ts, line 57:

<comment>Close the DailyRefinementJob SQLite connection as part of cleanup; this suite leaves an extra sqlite3 handle open.</comment>

<file context>
@@ -0,0 +1,106 @@
+            `,
+            (err) => {
+              if (err) return done(err);
+              job = new DailyRefinementJob();
+              done();
+            }
</file context>

done();
}
);
}
);
});
});

afterAll((done) => {
db.close((err) => {
if (fs.existsSync(TEST_DB_PATH)) fs.unlinkSync(TEST_DB_PATH);
if (fs.existsSync(TEST_WEIGHTS_PATH)) fs.unlinkSync(TEST_WEIGHTS_PATH);
// Clean up snapshots created today
const files = fs.readdirSync(TEST_SNAPSHOTS_DIR);
for (const file of files) {
fs.unlinkSync(path.join(TEST_SNAPSHOTS_DIR, file));
}
done();
});
});
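On the P2 comment about the leaked handle: `DailyRefinementJob`'s API is not shown in this diff, so whether it exposes a `close()` method is an assumption. The cleanup pattern the reviewer asks for can be sketched generically, with stubs standing in for the job and the database:

```typescript
// Sketch of sequential cleanup for multiple sqlite3-style handles. The
// Closable shape mirrors sqlite3's Database#close(callback); whether
// DailyRefinementJob exposes anything similar is an assumption.
type Closable = { close(cb: (err: Error | null) => void): void };

function closeAll(handles: Closable[], done: (err?: Error) => void): void {
  let i = 0;
  const next = (err: Error | null): void => {
    if (err) return done(err); // surface the first close error
    if (i === handles.length) return done(); // all handles released
    handles[i++].close(next);
  };
  next(null);
}

// Stub handles to demonstrate the order of operations:
const closed: string[] = [];
const stub = (name: string): Closable => ({
  close: (cb) => { closed.push(name); cb(null); },
});
closeAll([stub("job"), stub("db")], () => {
  console.log(closed); // [ 'job', 'db' ]
});
```

In the suite itself, `afterAll` would pass the job's handle and `db` to such a helper before deleting the test files.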

test("runRefinement should execute successfully and update weights and snapshots", async () => {
// Before run, capture initial state if any
const initialSnapshotsCount = fs.readdirSync(TEST_SNAPSHOTS_DIR).length;

await job.runRefinement();

// Verify weights updated
expect(fs.existsSync(TEST_WEIGHTS_PATH)).toBe(true);
const weights = JSON.parse(fs.readFileSync(TEST_WEIGHTS_PATH, "utf8"));
// Since we inserted 6 resolved potholes, it should boost the weight for Pothole
expect(weights.categoryWeights.Pothole).toBeGreaterThan(1.0); // Assuming 1.0 is default, might be different. Let's just check it exists.
expect(weights.categoryWeights.Pothole).toBeDefined();
Comment on lines +85 to +90

⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Strengthen the weight assertion.

> 1.0 is too loose here; it would still pass even if the optimizer never bumped Pothole off the default value. Assert the expected increase from this fixture instead, or at least compare against the current baseline.

Suggested adjustment
-    expect(weights.categoryWeights.Pothole).toBeGreaterThan(1.0);
-    expect(weights.categoryWeights.Pothole).toBeDefined();
+    expect(weights.categoryWeights.Pothole).toBeGreaterThan(5.0);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/dailyRefinement.system.test.ts` around lines 85 - 90, The test's
assertion for weights.categoryWeights.Pothole is too weak; instead of just >
1.0, load the baseline/default weight (or previous weights snapshot) and assert
that weights.categoryWeights.Pothole is greater than that baseline by a minimal
delta or equals the exact expected boosted value from this fixture; locate the
test block that reads TEST_WEIGHTS_PATH and replace the loose
expect(weights.categoryWeights.Pothole).toBeGreaterThan(1.0) with an assertion
comparing against the baseline default (e.g.,
baselineWeights.categoryWeights.Pothole) or a specific expected value, ensuring
you still check weights.categoryWeights.Pothole isDefined.
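One way to phrase the stronger assertion, as a sketch (`baselineWeight` is an illustrative helper; the 1.0 fallback is the test's own assumption, not confirmed by the optimizer code):

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Illustrative helper: read the category's weight from a previous weights
// file if one exists, otherwise fall back to an assumed default of 1.0.
function baselineWeight(weightsPath: string, category: string): number {
  if (!fs.existsSync(weightsPath)) return 1.0;
  const weights = JSON.parse(fs.readFileSync(weightsPath, "utf8"));
  return weights.categoryWeights?.[category] ?? 1.0;
}

// The test could capture the baseline before runRefinement() and assert a
// strict increase afterwards:
//   const before = baselineWeight(TEST_WEIGHTS_PATH, "Pothole");
//   await job.runRefinement();
//   expect(weights.categoryWeights.Pothole).toBeGreaterThan(before);

// Quick demonstration against a throwaway file:
const tmp = path.join(fs.mkdtempSync(path.join(os.tmpdir(), "weights-")), "w.json");
const absent = baselineWeight(tmp, "Pothole"); // file absent, default applies
fs.writeFileSync(tmp, JSON.stringify({ categoryWeights: { Pothole: 4.5 } }));
const present = baselineWeight(tmp, "Pothole"); // read from file
console.log(absent, present); // 1 4.5
```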


// Verify snapshot generated
const newSnapshotsCount = fs.readdirSync(TEST_SNAPSHOTS_DIR).length;
expect(newSnapshotsCount).toBe(initialSnapshotsCount + 1);

const snapshotFiles = fs.readdirSync(TEST_SNAPSHOTS_DIR);
const latestSnapshotFile = snapshotFiles[snapshotFiles.length - 1];
const snapshotData = JSON.parse(fs.readFileSync(path.join(TEST_SNAPSHOTS_DIR, latestSnapshotFile), "utf8"));
Comment on lines +96 to +98
⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Sort snapshot filenames before picking the newest one.

readdirSync() does not guarantee order, so snapshotFiles[snapshotFiles.length - 1] can point at the wrong file on some filesystems. Sort the JSON filenames first, then take the last entry.

Suggested adjustment
-    const snapshotFiles = fs.readdirSync(TEST_SNAPSHOTS_DIR);
+    const snapshotFiles = fs
+      .readdirSync(TEST_SNAPSHOTS_DIR)
+      .filter((file) => file.endsWith(".json"))
+      .sort();
     const latestSnapshotFile = snapshotFiles[snapshotFiles.length - 1];
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/dailyRefinement.system.test.ts` around lines 96 - 98, snapshotFiles
coming from fs.readdirSync(TEST_SNAPSHOTS_DIR) is unordered; filter for JSON
snapshot files, sort the resulting array (e.g., snapshotFiles.filter(f =>
f.endsWith('.json')).sort()), then set latestSnapshotFile to the last element of
that sorted array before reading it with fs.readFileSync; update the variables
snapshotFiles and latestSnapshotFile accordingly (references: snapshotFiles,
latestSnapshotFile, TEST_SNAPSHOTS_DIR, readdirSync).
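The suggested plain `.sort()` is sufficient here because the snapshot files are named by ISO date (see the deleted `data/dailySnapshots/*.json` entries above), and ISO-8601 dates sort lexicographically in chronological order. A quick sanity check of the pattern:

```typescript
// Snapshot files are named YYYY-MM-DD.json, so a lexicographic sort is also
// a chronological sort; filtering first drops any stray non-JSON files.
const files = ["2026-04-17.json", "2026-03-21.json", "notes.txt", "2026-04-19.json"];
const snapshots = files.filter((f) => f.endsWith(".json")).sort();
const latest = snapshots[snapshots.length - 1];
console.log(latest); // 2026-04-19.json
```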


expect(snapshotData.indexScore).toBeDefined();
expect(snapshotData.emergingConcerns).toBeDefined();
// Pothole has a spike because there are 6 vs 0 in previous 24h
expect(snapshotData.emergingConcerns.length).toBeGreaterThan(0);
expect(snapshotData.emergingConcerns[0].category).toBe("Pothole");
});
});