
feat: verify S220 @g5n-dev bounty — NO verdict, lru_cache claim incorrect (#534) #345

Open

xliry wants to merge 4 commits into peteromallet:main from xliry:task-534-lota-1

Conversation


@xliry xliry commented Mar 7, 2026

Issue: #204
Submission: #204 (comment)
Author: @g5n-dev

Problem (in our own words)

The submission claims that cli.py lines 56-78 contain a threading hazard because @lru_cache is not thread-safe, that global state is entangled across three mechanisms (_DETECTOR_NAMES_CACHE, @lru_cache, and the on_detector_registered callback), that cache invalidation on registration is unnecessary, and that tests are polluted by callback accumulation.

Evidence

  • desloppify/cli.py:49-68 (at commit 6eb2065): _DETECTOR_NAMES_CACHE compat shim, @lru_cache on _get_detector_names_cached(), _invalidate_detector_names_cache() with on_detector_registered callback
  • desloppify/base/registry.py:413-431 (at commit 6eb2065): on_detector_registered() appends to _RUNTIME.callbacks; register_detector() adds to _RUNTIME.detectors and fires all callbacks
  • CPython functools.lru_cache implementation uses an internal lock (RLock in pure Python, C-level lock since Python 3.8), making it thread-safe for concurrent reads/writes including cache_clear()
  • Every call to register_detector() adds a new detector name to _RUNTIME.detectors, so cache invalidation on registration is correct and necessary

Fix

No fix needed — verdict is NO

Verdict

| Question | Answer | Reasoning |
| --- | --- | --- |
| Is this poor engineering? | NO | The central claim that @lru_cache is thread-unsafe is factually incorrect; CPython's implementation uses a lock |
| Is this at least somewhat significant? | NO | The code is a CLI entry point, not a web server; threading concerns are theoretical and based on wrong premises |

Final verdict: NO

Scores

| Criterion | Score |
| --- | --- |
| Significance | 2/10 |
| Originality | 3/10 |
| Core Impact | 1/10 |
| Overall | 2/10 |

Summary

The submission's central technical claim — that @lru_cache is not thread-safe — is factually incorrect. CPython's lru_cache uses an internal lock and is thread-safe. The cache invalidation logic is correct since every register_detector() call adds a new detector name. The _DetectorNamesCacheCompat is explicitly labeled a test compat shim. This is a CLI entry point where multi-threading concerns are theoretical and inapplicable.
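The thread-safety point can be checked directly. The sketch below hammers a cached function from several threads while another thread repeatedly calls cache_clear(); with CPython's lru_cache this runs cleanly. The function name and iteration counts are illustrative, not taken from the codebase. Note that lru_cache guarantees internal consistency, not that the wrapped function runs exactly once per key under a race:

```python
import threading
from functools import lru_cache

@lru_cache(maxsize=None)
def expensive(n):
    # Deterministic stand-in for a cached computation.
    return n * n

errors = []

def hammer():
    # Concurrent reads/writes through the cache.
    try:
        for i in range(1000):
            assert expensive(i % 10) == (i % 10) ** 2
    except Exception as exc:
        errors.append(exc)

def clearer():
    # Concurrent cache invalidation, as _invalidate_detector_names_cache()
    # would do on detector registration.
    for _ in range(100):
        expensive.cache_clear()

threads = [threading.Thread(target=hammer) for _ in range(4)]
threads.append(threading.Thread(target=clearer))
for t in threads:
    t.start()
for t in threads:
    t.join()

print(errors)  # [] — no corruption or exceptions from concurrent use
```

If @lru_cache were genuinely thread-unsafe as the submission claims, this kind of stress test would be expected to raise or return corrupted values; it does not.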

Why Desloppify Missed This

  • What should have caught it: a "global mutable state" or "threading hazard" detector
  • Why it was not caught: the code is actually correct — there is no real threading hazard to catch. The compat-shim redundancy is minor.
  • What could catch it: a redundant-caching detector could flag the dual _DETECTOR_NAMES_CACHE + @lru_cache pattern, but that pattern is explicitly labeled as temporary test compatibility.

Generated with Lota

xliry and others added 4 commits March 7, 2026 03:58