Release v1.9-20260304 — Security Patch#206
Conversation
Merge tag 'v1.7-20241211'
…the tests to pass locally
Unit test updates
177: isApproved feature
feature addition: manual solicitation check
Increase disk_quota to 4096M for larger Docker image
More robust boolean handling
fix: update boolean handling to support both string and boolean compl…
security: fix npm audit vulnerabilities (57→10)
Finding #1 (Moderate): Add HSTS Strict-Transport-Security header
- max-age=31536000 (1 year), includeSubDomains, preload

Finding #3 (Low): Tighten CORS configuration
- Reject CORS for requests with no Origin header in production
- Remove unused dev/staging origins from shared CORSWhitelist

Finding #4 (Low): Remove information disclosure headers
- Strip Server and X-Powered-By headers
- Add X-Content-Type-Options, X-Frame-Options, Referrer-Policy, Permissions-Policy security headers

Ref: SRT Penetration Test Report v1 (March 2026) by Valiant Solutions
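The header changes in Findings #1 and #4 can be sketched as a single Express middleware. This is an illustrative sketch, not the project's actual code: the function name and the `Referrer-Policy`/`Permissions-Policy` values are assumptions, while the HSTS value comes from the finding itself.

```javascript
// Sketch only: applies the headers from Findings #1 and #4.
// The HSTS value matches the finding; other header values are illustrative.
function securityHeaders(req, res, next) {
  // Finding #1: HSTS for one year, covering subdomains, preload-eligible
  res.set('Strict-Transport-Security', 'max-age=31536000; includeSubDomains; preload');
  // Finding #4: add hardening headers...
  res.set('X-Content-Type-Options', 'nosniff');
  res.set('X-Frame-Options', 'DENY');
  res.set('Referrer-Policy', 'no-referrer');
  res.set('Permissions-Policy', 'camera=(), microphone=(), geolocation=()');
  // ...and strip information-disclosure headers
  res.removeHeader('X-Powered-By');
  res.removeHeader('Server');
  next();
}
```

Such a middleware would be registered with `app.use(securityHeaders)` ahead of the routes; `res.set` is Express's header setter and `res.removeHeader` comes from Node's underlying `http.ServerResponse`.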
The apt-get install of cf8-cli was failing intermittently because the underlying GitHub release CDN returns 500 errors. This replaces the two-step Debian repo approach with a single direct binary download from packages.cloudfoundry.org with 3 retry attempts.
fix: replace flaky Debian CF CLI install with direct download + retry
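The retry behavior described above can be sketched as a small shell helper. This is a sketch with illustrative names; the actual install script and download URL are not reproduced here.

```shell
# Sketch only: generic retry wrapper of the kind used for the direct
# binary download. Function name and attempt/sleep values are illustrative.
retry() {
  local attempts=$1; shift
  local i
  for ((i = 1; i <= attempts; i++)); do
    if "$@"; then
      return 0
    fi
    echo "attempt $i of $attempts failed: $*" >&2
    (( i < attempts )) && sleep 1
  done
  return 1
}

# Hypothetical usage (real URL elided):
# retry 3 curl -fsSL -o /tmp/cf-cli.tgz "$CF_CLI_DOWNLOAD_URL"
```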
The pen test remediation over-trimmed the CORSWhitelist, removing legitimate dev/staging/prod client URLs. The actual CORS fix was in the callback logic (rejecting undefined origins), not in removing real client origins.
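The callback logic referred to above can be sketched as follows. The whitelist contents and function name are illustrative; this shows the origin-callback shape the `cors` package accepts for its `origin` option, not the project's exact code.

```javascript
// Sketch only: the fix lives in the callback, not in shrinking the whitelist.
const CORSWhitelist = ['https://app.example.gov']; // real client origins stay listed
const isProduction = process.env.NODE_ENV === 'production';

function corsOrigin(origin, callback) {
  if (origin === undefined) {
    // Finding #3: in production, reject requests carrying no Origin header
    if (isProduction) return callback(new Error('CORS: missing Origin header'));
    return callback(null, true); // allow Origin-less tools (e.g. curl) in dev
  }
  if (CORSWhitelist.includes(origin)) return callback(null, true);
  return callback(new Error(`CORS: origin ${origin} not allowed`));
}
```

A limiter of this shape would be passed as `cors({ origin: corsOrigin })`.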
…diate CVE-2026-2391

Addresses CISA KEV finding SNYK-JS-QS-15268416 (Allocation of Resources Without Limits or Throttling in qs >=6.7.0 <6.14.2).
fix(security): upgrade express to 4.22.1 and pin qs to 6.14.2
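Pinning a transitive dependency such as `qs` is commonly done with npm's `overrides` field (supported since npm 8.3). The fragment below is a hypothetical illustration, not the project's actual package.json:

```json
{
  "dependencies": {
    "express": "^4.22.1"
  },
  "overrides": {
    "qs": "6.14.2"
  }
}
```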
```javascript
router.post('/analyze-documents', express.json(), async (req, res) => {
  try {
    const documents = req.body.documents;
    if (!documents || Object.keys(documents).length === 0) {
      logger.warn('No document texts were provided.');
      return res.status(400).json({ error: 'No document texts provided' });
    }

    logger.info(`Processing ${Object.keys(documents).length} document(s)...`);

    // Create a JSON payload to send via STDIN
    const inputData = JSON.stringify({ documents });
    logger.info(`Sending payload to Python: ${inputData}`);

    // Spawn the Python process with piped stdio so we can write to STDIN and read STDOUT
    const pythonProcess = spawn(PYTHON_PATH, SCRIPT_ARGS, { stdio: ['pipe', 'pipe', 'pipe'] });

    let stdout = '';
    let stderr = '';

    // Collect stdout data
    pythonProcess.stdout.on('data', (data) => {
      stdout += data.toString();
    });

    // Collect stderr data and log it
    pythonProcess.stderr.on('data', (data) => {
      stderr += data.toString();
      logger.error(`Python stderr: ${data.toString().trim()}`);
    });

    pythonProcess.on('error', (error) => {
      logger.error(`Failed to start Python process: ${error.message}`);
    });

    // Write the JSON payload to the Python process's STDIN and close it
    pythonProcess.stdin.write(inputData);
    pythonProcess.stdin.end();

    // When the Python process closes, process the output
    pythonProcess.on('close', (code) => {
      logger.info(`Python process exited with code: ${code}`);
      logger.info(`Raw Python output: ${stdout}`);

      if (code !== 0) {
        const errorMsg = stderr || `Python process exited with code ${code}`;
        logger.error(`Python process failed: ${errorMsg}`);
        return res.status(500).json({ error: errorMsg });
      }

      let result;
      try {
        result = JSON.parse(stdout);
      } catch (parseError) {
        logger.error(`Failed to parse Python output: ${parseError.message}`);
        logger.error(`Raw output: ${stdout}`);
        return res.status(500).json({ error: 'Invalid response from Python process' });
      }

      logger.info(`Parsed Python result: ${JSON.stringify(result)}`);

      // Convert boolean predictions into "compliant" / "non-compliant" strings.
      const transformed = {};
      for (const fname in result.predictions) {
        const prediction = result.predictions[fname];
        logger.info(`Document: ${fname}, Prediction: ${prediction}`);
        // Handle both boolean and string cases
        const isCompliant = prediction === true || prediction === 'True' || prediction === 'compliant';
        transformed[fname] = isCompliant ? 'compliant' : 'non-compliant';
      }

      logger.info(`Transformed result: ${JSON.stringify(transformed)}`);
      res.json(transformed);
    });
  } catch (error) {
    logger.error(`Unexpected error during document analysis: ${error.message}`);
    res.status(500).json({ error: 'An unexpected error occurred' });
  }
});
```
Check failure — Code scanning / CodeQL: Missing rate limiting (High)

Copilot Autofix suggestion:
In general, expensive HTTP handlers that spawn subprocesses or perform heavy I/O should be protected with rate limiting middleware. In an Express app, a common approach is to use the express-rate-limit package to define a limiter (window size, maximum requests per window, response behavior) and apply it either globally or to individual sensitive routes.
For this file, the least invasive and clearest fix is to add a route-specific rate limiter middleware to /analyze-documents using express-rate-limit. We should (1) import express-rate-limit, (2) configure a limiter instance with reasonable defaults (e.g., limit requests per IP over a time window), and (3) insert that limiter as middleware in the router.post('/analyze-documents', ...) call, without changing the existing business logic. Concretely:
- At the top of `server/routes/document.routes.js`, add `const rateLimit = require('express-rate-limit');`.
- Below the constants for `PYTHON_PATH` and `SCRIPT_ARGS`, define a limiter such as `const analyzeDocumentsLimiter = rateLimit({ windowMs: 15 * 60 * 1000, max: 100, standardHeaders: true, legacyHeaders: false });`.
- Update the route to `router.post('/analyze-documents', analyzeDocumentsLimiter, express.json(), async (req, res) => { ... });` so the limiter runs before the JSON parsing and subprocess spawn.
This keeps existing behavior intact while adding the required protection.
```diff
@@ -2,14 +2,22 @@
 const router = express.Router();
 const { spawn } = require('child_process');
 const logger = require('../config/winston');
+const rateLimit = require('express-rate-limit');

 const PYTHON_PATH = 'python3';
 const SCRIPT_ARGS = ['-m', 'srt_ml.predict.analyze_text'];

+const analyzeDocumentsLimiter = rateLimit({
+  windowMs: 15 * 60 * 1000, // 15 minutes
+  max: 100, // limit each IP to 100 requests per windowMs
+  standardHeaders: true, // Return rate limit info in the `RateLimit-*` headers
+  legacyHeaders: false, // Disable the `X-RateLimit-*` headers
+});

 logger.info(`PYTHON_PATH: ${PYTHON_PATH}`);
 logger.info(`SCRIPT_ARGS: ${SCRIPT_ARGS.join(' ')}`);

-router.post('/analyze-documents', express.json(), async (req, res) => {
+router.post('/analyze-documents', analyzeDocumentsLimiter, express.json(), async (req, res) => {
   try {
     const documents = req.body.documents;
     if (!documents || Object.keys(documents).length === 0) {
```
```diff
@@ -53,7 +53,8 @@
     "sequelize": "^6.37.3",
     "sequelize-cli": "^6.6.1",
     "umzug": "^2.3.0",
-    "winston": "^3.14.2"
+    "winston": "^3.14.2",
+    "express-rate-limit": "^8.2.1"
   },
   "devDependencies": {
     "cryptify": "^3.0.3",
```
| Package | Version | Security advisories |
| --- | --- | --- |
| express-rate-limit (npm) | 8.2.1 | None |
Release v1.9-20260304 — Security Patch

Security Fixes

- `qs` pinned to 6.14.2 via Express 4.22.1 (CISA KEV finding)

Features