Conversation
@claude /review
This comment was marked as outdated.
@claude /review |
Claude finished @bctiemann's task in 2m 51s — Code Review: Polymorphic Object Fields (PR #442)

Overview

This is a substantial, well-architected feature addition. The PR introduces polymorphic GFK (GenericForeignKey) and polymorphic M2M through-table fields to the plugin's custom object system. The code quality is high, with thorough inline documentation explaining non-obvious design decisions. Below are specific findings.

Security

GitHub Advanced Security Alert — The GHAS alert flagged:

```python
# serializers.py:106-109
try:
    return model_class.objects.get(pk=obj_id)
except (model_class.DoesNotExist, ValueError, TypeError, OverflowError):
    raise serializers.ValidationError("No matching object found.") from None
# Consider also catching Exception as a fallback:
except Exception:
    raise serializers.ValidationError("Unable to retrieve the specified object.") from None
```

Bugs

1. This is a minor performance issue, not a correctness bug, but the misleading docstring should be updated. […]
2. When a user fills in multiple type sub-fields for the same GFK (which […]

Design Concerns

3. Known limitation: multi-hop polymorphic cycles are undetected (documented in PR description). The […]
4. The comment explains this is intentional ("keeping serializer generation fresh"). However, this means every API request re-evaluates two […]
5. The method issues one […]

Code Quality — Positives […]

Minor Issues

6. If a form submission contains a sub-field name that doesn't match the expected […]
7. For polymorphic fields, the […]
8. The […]

Summary

This is a mature, production-quality implementation of a genuinely complex feature. The most actionable items are: […]

The rest are minor polish items. The PR is well-positioned to merge after addressing the security finding.
arthanson
left a comment
The caching changes and extract_cot_id_from_model_name can be split out into a separate PR, as they are functionally independent; that would make this very large PR easier to review and more targeted.
suppress_clear_cache is only used in utilities.py and can be a private method. install_clear_cache_suppressor is only called in __init__, so that code can be separated out into another PR (the code in utilities.py and __init__.py).
Conflict resolutions:
- __init__.py: use cot_id_str (already extracted) instead of calling extract_cot_id_from_model_name() twice
- api/serializers.py: keep both _poly_obj/m2m_fields class attrs (HEAD) and get__context() method (feature)
- field_types.py: keep logger/_safe_index_name helpers (HEAD); add None guard after extract_cot_id_from_model_name() calls (feature) throughout
- forms.py: keep CustomContentTypeMultipleChoiceField (HEAD); keep both related_name fieldset logic (feature) and clean() validator (HEAD)
- models.py: merge imports (_suppress_clear_cache + ReindexCustomObjectTypeJob); use _suppress_clear_cache (private name from feature); keep both polymorphic validation and related_name validation in clean()
- utilities.py: rename suppress_clear_cache -> _suppress_clear_cache (feature)
- views.py: keep forms.fields import (HEAD) and build_filterset_form_class import (feature); replace assert with raise ValueError

Migration renumbering:
- 0005_polymorphic_object_fields -> 0007_polymorphic_object_fields (0005/0006 from feature: context + related_name fields, now precede it)
- Updated 0007 dependency: 0004 -> 0006_customobjecttypefield_related_name_and_more

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…cache) Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
- forms.py: remove duplicate extract_cot_id_from_model_name import
- views.py: remove unused TagFilterField from utilities.forms.fields import

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
GenericForeignKey silently ignores related_name, and PolymorphicM2MDescriptor never consumes it, so setting related_name on a polymorphic field would produce no working reverse accessor with no error. Raise ValidationError to surface this instead. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
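A minimal stand-alone sketch of that validation (hypothetical helper name; the real implementation would live in the model's clean() and raise django.core.exceptions.ValidationError rather than ValueError):

```python
def validate_related_name(is_polymorphic: bool, related_name: str) -> None:
    # GenericForeignKey silently ignores related_name, and the polymorphic
    # M2M descriptor never consumes it, so a related_name on a polymorphic
    # field would produce no working reverse accessor. Reject it up front
    # instead of letting it fail silently.
    if is_polymorphic and related_name:
        raise ValueError(
            "related_name is not supported on polymorphic fields: "
            "GenericForeignKey provides no reverse accessor."
        )
```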
Wraps bare strings in PolymorphicObjectSerializerField.to_internal_value() and CustomObjectTypeFieldSerializer._resolve_object_type()/validate() with _() for consistency with the rest of the file. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Adds three test cases to PolymorphicFieldAPITest covering:
- DELETE with a populated GFK (poly_obj) value → 204, object removed from DB
- DELETE with populated M2M (poly_multi) values → 204, object removed, through-table rows cleaned up
- DELETE with empty polymorphic fields (baseline) → 204

Also adds APP_LABEL import used to resolve the through model by name.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
arthanson
left a comment
The feature branch now has the changes needed for the tests to pass, but it has merge conflicts.
Resolve conflict in field_types.py set(): take the get_or_create implementation from feature (matching fix-django60-m2m-prefetch). Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
When get_model() is called a second time (e.g. because a field save updates cache_timestamp between setUp() and the DELETE request), the polymorphic M2M through model is already in the app registry. The _after_model_generation code only set source_field.related_model in the LookupError branch, leaving the FK pointing at the old model class on all subsequent calls.

Django's Collector then does ThroughModel.objects.filter(source__in=[instance]), and resolve_lookup_value raises ValueError because issubclass(new_class, old_class) is False even though both classes share the same name.

Fix: always update source_field.remote_field.model and related_model to the current model class, regardless of whether the through model was newly created or retrieved from the registry.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
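The fix described in that commit can be illustrated with plain-Python stand-ins (no Django required; `RemoteField`, `SourceField`, and `app_registry` here merely mimic the Django internals named above):

```python
class RemoteField:
    """Stand-in for Django's ForeignKey.remote_field."""
    def __init__(self):
        self.model = None

class SourceField:
    """Stand-in for the through model's FK back to the generated model."""
    def __init__(self):
        self.remote_field = RemoteField()
        self.related_model = None

app_registry = {}  # stand-in for Django's app registry

def get_through_model(name, source_field, current_model):
    # Reuse a registered through model when one exists (the non-LookupError
    # path); otherwise create and register it.
    if name not in app_registry:
        app_registry[name] = type(name, (), {})
    # The fix: ALWAYS rebind the FK to the current model class, even when
    # the through model came from the registry. Updating it only in the
    # newly-created branch leaves the FK pointing at a stale class, and the
    # later issubclass(new_class, old_class) check fails.
    source_field.remote_field.model = current_model
    source_field.related_model = current_model
    return app_registry[name]
```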
arthanson
left a comment
This is returning double form fields for items (e.g. two form items for Circuits, two for Region).
Also, I think a better UI for this would be like what we do for scope, where we have a dropdown for the type and dynamically switch the field under it. In the list view we probably also want to show the Poly Type next to Poly, so you know what it actually is.
Claude also came up with a couple of points:
- through_table_name can exceed PostgreSQL's 63-char identifier limit
IMPORTANT — data integrity risk on large deployments.
through_table_name is computed as:
f"custom_objects_{self.custom_object_type_id}_{self.name}"
The name field allows up to 50 characters (max_length=50). A realistic worst-case (e.g. COT id = 99999, name = 50 chars): 15 + 5 + 1 + 50 = 71 chars — 8 over PostgreSQL's hard 63-byte
limit. PostgreSQL silently truncates identifiers, which can cause silent name collisions between tables sharing a long prefix.
_safe_index_name() already exists in field_types.py for this purpose. An analogous _safe_table_name() should be applied to through_table_name (and correspondingly to
through_model_name).
Note: This doesn't require a data migration for any existing installations that haven't hit the limit yet, but should be addressed before GA.
- Orphaned field row if check_polymorphic_recursion signal raises
The save path for a new polymorphic field via API:
1. CustomObjectTypeField.save() — creates DB columns/through table (committed, or at least schema-altered)
2. DRF create() → instance.related_object_types.set(resolved) — fires m2m_changed
3. check_polymorphic_recursion (pre_add) raises ValidationError → DRF surfaces 400
At step 3, the field row + DB schema already exist, but related_object_types is never set. The schema is in a partially-initialized state: the GFK columns (or through table) exist, but
the field instance has no allowed types.
The serializer's validate() catches some problems early, but it does NOT check for circular references (it can't — it runs before any DB writes). The signal is the only guard.
Mitigation options:
- Wrap the save() + set() in the DRF create() in an atomic block, catching ValidationError from the signal and deleting the field row in the exception handler.
- Or document clearly that related_object_types.set() must succeed for the field to be usable, and add a check in get_model() that skips is_polymorphic fields with zero
related_object_types.
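The first mitigation option could look roughly like this (plain-Python stand-ins; in the real serializer this would sit inside transaction.atomic() and catch Django's ValidationError, and `StubField` exists only to exercise the cleanup path):

```python
class ValidationError(Exception):
    """Stand-in for django.core.exceptions.ValidationError."""

def create_field_with_cleanup(field, resolved_types):
    # Step 1: save() creates the DB columns / through table.
    field.save()
    try:
        # Step 2: setting allowed types fires m2m_changed, which may
        # trigger check_polymorphic_recursion and raise.
        field.set_related_object_types(resolved_types)
    except ValidationError:
        # Step 2 failed: delete the just-created row so no orphaned,
        # schema-bearing field with zero allowed types is left behind.
        field.delete()
        raise

class StubField:
    """Minimal in-memory field used to demonstrate the cleanup path."""
    def __init__(self, reject=False):
        self.reject = reject
        self.saved = self.deleted = False
        self.types = None
    def save(self):
        self.saved = True
    def delete(self):
        self.deleted = True
    def set_related_object_types(self, types):
        if self.reject:
            raise ValidationError("circular polymorphic reference")
        self.types = types
```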
Closes: #31
This PR adds a major feature: polymorphic object/multi-object fields (using a GenericForeignKey pattern). This means that the related object type for these fields can be any of a selected list, rather than just a single related object type.

Editing a custom object with such a field shows an individual field in the edit form for each object type supported by the object/multi-object field, allowing selection to work with the existing object-type-based selection lists.
Unit tests have been added covering all UI and API routes, including validating that only the selected related object types can be assigned to the field.
NOTE: