
Conversation


@flixx flixx commented Jun 25, 2019

Hello.

We use openapi-core to validate our specs with the backend, so that we are sure that the documentation generated out of the specs is up-to-date.

However, so far, the validation only worked one way:
- Properties returned in the response of the backend that are missing in the specs generate an error (good).
- Properties in the specs that are not returned in the response of the backend generate no error unless they are required (bad).

One solution would be, of course, to set all properties to required. However, we do not want to do that, since it is (1) confusing in the documentation and (2) a lot of work.

This PR introduces require_all_props. It can be set like this:

RequestValidator(spec, require_all_props=True)
# or
ResponseValidator(spec, require_all_props=True)

If it is set to true, the validator behaves as if all properties defined in the specs were required.
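For illustration, here is a minimal standalone sketch of the intended semantics (plain Python, not openapi-core's internals; the function and variable names are made up): with the flag set, every property declared under properties must be present, not only those listed under required.

# Standalone illustration of the proposed semantics; not openapi-core code.
def find_missing_props(schema, value, require_all_props=False):
    declared = set(schema.get("properties", {}))
    required = set(schema.get("required", []))
    # With the flag, every declared property must be present in the value.
    must_be_present = declared if require_all_props else required
    return sorted(must_be_present - set(value))

schema = {
    "type": "object",
    "properties": {"id": {"type": "integer"}, "name": {"type": "string"}},
    "required": ["id"],
}

print(find_missing_props(schema, {"id": 1}))                          # []
print(find_missing_props(schema, {"id": 1}, require_all_props=True))  # ['name']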

Felix

When setting the new flag `require_all_props` to true,
all properties in the specs are treated as required.
        prop_value = value[prop_name]
    except KeyError:
-       if prop_name in all_req_props_names:
+       if (prop_name in all_req_props_names) or require_all_props:
@flixx (Author):

@reviewers This is the important line. All other changes just pass the flag around.
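For reviewers reading this outside the diff, here is a self-contained sketch of the guarded lookup around the changed line (the names value, all_req_props_names and require_all_props come from the diff above; the exception and function are assumed for illustration):

class MissingSchemaProperty(Exception):
    """Stand-in for the validation error raised by openapi-core (name assumed)."""

def read_prop(value, prop_name, all_req_props_names, require_all_props=False):
    try:
        return value[prop_name]
    except KeyError:
        # Old behaviour: only properties listed as required raise here.
        # New behaviour: with the flag set, any declared-but-missing property raises.
        if (prop_name in all_req_props_names) or require_all_props:
            raise MissingSchemaProperty(prop_name)
        return None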


codecov bot commented Jun 25, 2019

Codecov Report

Merging #146 into master will increase coverage by 0.25%.
The diff coverage is 100%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master     #146      +/-   ##
==========================================
+ Coverage   96.19%   96.44%   +0.25%     
==========================================
  Files          58       58              
  Lines        1602     1605       +3     
==========================================
+ Hits         1541     1548       +7     
+ Misses         61       57       -4
Impacted Files Coverage Δ
openapi_core/schema/media_types/models.py 85.29% <100%> (ø) ⬆️
openapi_core/schema/schemas/models.py 96.67% <100%> (+0.01%) ⬆️
openapi_core/schema/parameters/models.py 96.82% <100%> (ø) ⬆️
openapi_core/validation/response/validators.py 100% <100%> (ø) ⬆️
openapi_core/schema/schemas/unmarshallers.py 100% <100%> (ø) ⬆️
openapi_core/validation/request/validators.py 100% <100%> (ø) ⬆️
openapi_core/compat.py 100% <0%> (+44.44%) ⬆️

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 0e30b71...0b34ed1. Read the comment docs.

1 similar comment

for t, u in primitive_unmarshallers.items()
)

pass_defaults = lambda f: functools.partial(
@flixx (Author):

I needed to refactor this part because the parameters were not passed through correctly by the lambda function here.
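For context, a standalone sketch (not openapi-core's actual code; unmarshal and the parameter names are invented) of why the lambda-built partial had to be refactored: functools.partial only binds the keyword arguments it is given at creation time, so a newly introduced keyword such as require_all_props is silently left at its default unless the lambda forwards it as well.

import functools

def unmarshal(value, custom_formatters=None, require_all_props=False):
    # Toy stand-in for a schema unmarshaller.
    return {"value": value, "require_all_props": require_all_props}

custom_formatters = {}

# Original style: only custom_formatters is bound, the new flag never arrives.
pass_defaults = lambda f: functools.partial(f, custom_formatters=custom_formatters)
print(pass_defaults(unmarshal)("x"))  # require_all_props stays False

# Refactored style: the new flag is threaded through explicitly.
pass_defaults = lambda f: functools.partial(
    f, custom_formatters=custom_formatters, require_all_props=True)
print(pass_defaults(unmarshal)("x"))  # require_all_props is now True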

bjmc (Contributor) commented Jun 30, 2019

Properties in the specs that are not returned in the response of the backend generate no error unless they are required (bad).

Could you explain this a little more? If those properties aren't required in the spec, then they are optional. Why is it bad not to raise an error when optional properties are omitted? If all the properties are required, they should all be marked required in the spec.

Maybe there's something subtle I'm misunderstanding here.

flixx (Author) commented Jul 8, 2019

Sure @bjmc @p1c2u:
We publish API documentation based on our OpenAPI spec to our customers.
We want to ensure that the documentation is always up to date.

In order to ensure that all fields that are in the specs are in the response of our backend, we would need to mark every single property as required.

I think the term "required" does not really make sense when you talk about responses. In our API documentation we want to tell the API user which properties are required in the request, yes, but marking something as "required" in the response does not really make sense: our backend always returns a response with the complete list of properties (some of them nullable).

So yes, technically it would be most correct to always mark all fields in the response schemas as required. But practically, this is both a lot of work (it is easy to forget to add new properties to the required list) and strange for a reader of the documentation.

So I understand your point - but the changes I suggest here have already helped us make our API documentation a lot better.

Felix

@spacether

@flixx if this is always the case, then why not add an automated postprocessing step that marks all response schema properties as required and generates the updated spec? That way it will always be done.
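A rough sketch of such a postprocessing step, assuming the spec has already been loaded into a Python dict (e.g. with a YAML/JSON parser); it only touches inline response schemas and does not resolve $ref into components:

def require_all_response_props(spec):
    # Walk paths -> operations -> responses -> content -> schema and mark
    # every declared property as required. $ref'd schemas are left untouched.
    for path_item in spec.get("paths", {}).values():
        for operation in path_item.values():
            if not isinstance(operation, dict):
                continue  # skip non-operation keys such as "parameters" or "summary"
            for response in operation.get("responses", {}).values():
                for media_type in response.get("content", {}).values():
                    schema = media_type.get("schema", {})
                    props = schema.get("properties")
                    if props:
                        schema["required"] = sorted(props)
    return spec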

@Julius2342

I would love this feature! Interestingly, the AI hallucinated exactly such a flag when I described this problem.

@flixx : Or did you find a workaround for this?

@spacether

A property that will always be present and must always be present is required. Value nullability is a separate concept. If a value may be null, you should describe it as a list of allowed types, or as type string with nullable: true, etc., depending on your OpenAPI version, since different versions use different JSON Schema dialects.
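Illustrative schema fragments (written as Python dicts for consistency with the rest of this thread; the keywords follow OpenAPI/JSON Schema) for a property that is always present but may be null:

# OpenAPI 3.0.x: null is expressed with the "nullable" keyword.
schema_30 = {
    "type": "object",
    "required": ["name"],
    "properties": {"name": {"type": "string", "nullable": True}},
}

# OpenAPI 3.1.x (JSON Schema 2020-12): null is expressed as an allowed type.
schema_31 = {
    "type": "object",
    "required": ["name"],
    "properties": {"name": {"type": ["string", "null"]}},
}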

@Julius2342

Assume a REST resource that provides CRUD operations. The resource payload is referenced via $ref: '#/components/schemas/my_Resource'. Some fields are not required, especially when POSTing the resource.

When running all API tests, I want to verify that all fields are described in the documentation and that all fields described in the documentation are present when retrieving the resource.

Using different schemas for retrieving and sending, each with its own set of required fields, would solve the problem, but with the inherent drawback that descriptions get duplicated and become outdated or inaccurate.
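One possible way (sketched below as a Python dict; the schema names besides my_Resource are made up) to keep separate read and write schemas without duplicating descriptions is to share a base schema and only add the required lists via allOf:

components = {
    "schemas": {
        # Shared base: properties and descriptions live here exactly once.
        "my_Resource": {
            "type": "object",
            "properties": {
                "id": {"type": "integer", "description": "Server-assigned id."},
                "name": {"type": "string", "description": "Display name."},
            },
        },
        # Request body: only "name" has to be sent.
        "my_Resource_Write": {
            "allOf": [
                {"$ref": "#/components/schemas/my_Resource"},
                {"required": ["name"]},
            ],
        },
        # Response body: every field is expected back.
        "my_Resource_Read": {
            "allOf": [
                {"$ref": "#/components/schemas/my_Resource"},
                {"required": ["id", "name"]},
            ],
        },
    }
}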

flixx (Author) commented Feb 4, 2026

@Julius2342 @spacether

Hello! Glad to see that we are not the only ones with this problem. We have not found a workaround for it yet; it keeps being quite an annoyance for us.

So we still manually mark all response fields as required in the specs. We do this so our tests fail if a field is not returned. However, it is confusing for readers of the docs (which we generate with Redocly), because every response field is marked as required there.

A flag like the one I proposed back in 2019 would still be awesome. However, at the moment I don't have the capacity or knowledge to bring this PR into a mergeable state, so it would be greatly appreciated if someone could pick this up and continue the work.

Felix
