feat: Support cert config for openaicompatible models #143

paul-cayet wants to merge 3 commits into main from
Conversation
Internal regression failed: Build ID #380

Internal regression succeeded 🍏: Build ID #381
    ``OpenAiCompatibleConfig`` now supports optional ``key_file``, ``cert_file``, and ``ca_file``
    fields for HTTPS and mTLS connections to private endpoints. These fields are treated as
    sensitive values and are exported as references rather than inlined secrets.

Review comment: no need for the "rather than inlined secrets"
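As a quick sketch of what the new optional fields enable (the dataclass below is a hypothetical stand-in, not the real ``OpenAiCompatibleConfig``; only the field names follow the PR):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OpenAiCompatibleConfigSketch:
    """Hypothetical stand-in for OpenAiCompatibleConfig (illustration only)."""
    url: str
    api_key: str
    key_file: Optional[str] = None   # client private key, for mTLS
    cert_file: Optional[str] = None  # client certificate, for mTLS
    ca_file: Optional[str] = None    # CA bundle, for HTTPS server verification

# All three certificate fields stay optional; only what is set is used.
llm_config = OpenAiCompatibleConfigSketch(
    url="https://llm.internal.example/v1",
    api_key="dummy",
    ca_file="/etc/certs/ca.pem",
)
assert llm_config.ca_file == "/etc/certs/ca.pem"
assert llm_config.key_file is None and llm_config.cert_file is None
```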
    assert llm_config.ca_file == "/etc/certs/ca.pem"
    def test_openaicompatibleconfig_cannot_be_imported_without_the_required_api_key() -> None:

Review comment: please change the name of this test, e.g. *_without_required_sensitive_fields
    assert new_llm_config == llm_config
    def test_openaicompatibleconfig_can_be_imported_with_api_key_in_component_registry() -> None:

Review comment: please change the name of this test, e.g. *can_be_imported_with_sensitive_fields_in_components_registry

Review comment: Can you add a test that imports a YAML without any cert fields, to check that everything still works as before?
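The backward-compatibility test requested above could look roughly like this (the loader is a hypothetical stand-in, not this repo's actual YAML import path):

```python
from typing import Optional

def load_config_sketch(data: dict) -> dict:
    """Hypothetical loader: api_key is required, cert fields are optional
    and simply default to None when absent from the imported YAML/dict."""
    return {
        "api_key": data["api_key"],
        "key_file": data.get("key_file"),
        "cert_file": data.get("cert_file"),
        "ca_file": data.get("ca_file"),
    }

def test_can_be_imported_without_cert_fields() -> None:
    # A config with no cert fields should load exactly as before.
    cfg = load_config_sketch({"api_key": "dummy"})
    assert cfg["key_file"] is None
    assert cfg["cert_file"] is None
    assert cfg["ca_file"] is None

test_can_be_imported_without_cert_fields()
```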
Review comment: do we have a how-to for LLMs where we can mention it?
    has_certificate_configuration = any(
        certificate_path is not None
        for certificate_path in (self.key_file, self.cert_file, self.ca_file)
    )
    if agentspec_version < AgentSpecVersionEnum.v26_2_0 or not has_certificate_configuration:

Review comment: Mhm, why are we removing them also if the version is >=26.2.0 but they are None? I think this is against the current expectations, as the configuration is expected to be always complete at the moment.

Review comment: I think that as of now this should not be allowed (if version is 26.2.0)
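A minimal self-contained sketch of the export-time gating under discussion (dict-based configs and tuple versions are assumptions for illustration; the real code uses ``OpenAiCompatibleConfig`` and ``AgentSpecVersionEnum``):

```python
# Versions modeled as comparable tuples; stand-ins for AgentSpecVersionEnum.
V26_1_0 = (26, 1, 0)
V26_2_0 = (26, 2, 0)

CERT_FIELDS = ("key_file", "cert_file", "ca_file")

def strip_certificate_fields(config: dict, agentspec_version: tuple) -> dict:
    """Drop cert fields when exporting to a spec version that predates them,
    mirroring the `if` in the diff (including the None-fields branch the
    reviewer questions)."""
    has_certificate_configuration = any(
        config.get(field) is not None for field in CERT_FIELDS
    )
    if agentspec_version < V26_2_0 or not has_certificate_configuration:
        return {k: v for k, v in config.items() if k not in CERT_FIELDS}
    return config

# Old spec version: cert fields are removed on export.
old = strip_certificate_fields(
    {"api_key": "x", "ca_file": "/etc/certs/ca.pem"}, V26_1_0
)
assert "ca_file" not in old

# New spec version with certs configured: fields are kept.
new = strip_certificate_fields(
    {"api_key": "x", "ca_file": "/etc/certs/ca.pem"}, V26_2_0
)
assert new["ca_file"] == "/etc/certs/ca.pem"
```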
    assert llm_config.api_key == "THIS_SECRET_IS_SAFELY_INLINED"
    def test_configuration_new_certificate_sensitive_fields_and_old_version_26_1_0_cannot_be_loaded():

Review comment: Nit: I am not sure I would put this test here; it seems more related to the behavior of the OpenAI-compatible model rather than to sensitive fields. And you already have a very similar test in the other file, so I would probably just drop this one.
This PR adds support for SSL certificate configuration for OpenAI-compatible LLMs.