From ff46889f5c5f4e0c8d06afb885f9df59339e846c Mon Sep 17 00:00:00 2001
From: AMATH <116212274+amathxbt@users.noreply.github.com>
Date: Sat, 18 Apr 2026 09:48:49 +0100
Subject: [PATCH] Use LLM.from_url for llm_server_url in adapter

Fixes the llm_server_url path in the LangChain adapter by using
LLM.from_url when llm_server_url is provided. This prevents a TypeError
caused by passing llm_server_url to LLM.__init__().

Signed-off-by: AMATH <116212274+amathxbt@users.noreply.github.com>
---
 ...M.from_url when llm_server_url is provided | 23 +++++++++++++++++++
 1 file changed, 23 insertions(+)
 create mode 100644 fix(langchain): use LLM.from_url when llm_server_url is provided

diff --git a/fix(langchain): use LLM.from_url when llm_server_url is provided b/fix(langchain): use LLM.from_url when llm_server_url is provided
new file mode 100644
index 0000000..b8f19c4
--- /dev/null
+++ b/fix(langchain): use LLM.from_url when llm_server_url is provided
@@ -0,0 +1,23 @@
+## Summary
+
+Fixes the `llm_server_url` path in the LangChain adapter.
+
+When `llm_server_url` is provided, `OpenGradientChatModel` was passing it into `LLM(...)`, but `LLM.__init__()` does not accept that argument. This raises a runtime `TypeError` instead of creating a client bound to the provided TEE endpoint.
+
+This PR switches that path to use `LLM.from_url(private_key=..., llm_server_url=...)`, while preserving the existing registry-based `LLM(...)` behavior for the default case.
+
+Fixes #248
+
+## Problem
+
+`OpenGradientChatModel` and `langchain_adapter()` accept `llm_server_url`, but the implementation forwarded it as a keyword argument to `LLM(...)`.
+
+That constructor only accepts:
+- `private_key`
+- `rpc_url`
+- `tee_registry_address`
+
+As a result, any caller trying to use `llm_server_url` hits:
+
+```python
+TypeError: __init__() got an unexpected keyword argument 'llm_server_url'
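
For illustration, here is a minimal sketch of the construction branch the summary describes: when `llm_server_url` is set, build the client via `LLM.from_url`, otherwise fall back to the registry-based `LLM(...)` constructor. Only the `LLM(...)` keyword set and the `LLM.from_url(private_key=..., llm_server_url=...)` call come from the PR text; the helper name, import path, and optional defaults are assumptions for the example, not the adapter's actual code.

```python
# Minimal sketch of the construction branch this PR describes.
# Assumptions (not from the PR): the import path, the helper name, and
# that rpc_url / tee_registry_address may be left as None defaults.
from typing import Optional

from opengradient import LLM  # import path is an assumption


def _build_llm_client(
    private_key: str,
    rpc_url: Optional[str] = None,
    tee_registry_address: Optional[str] = None,
    llm_server_url: Optional[str] = None,
) -> LLM:
    """Return an LLM client, binding to a TEE endpoint when one is given."""
    if llm_server_url is not None:
        # Fixed path: bind to the provided endpoint via the factory method
        # instead of forwarding llm_server_url to LLM.__init__(), which
        # rejects it with a TypeError.
        return LLM.from_url(
            private_key=private_key,
            llm_server_url=llm_server_url,
        )
    # Default path: unchanged registry-based construction.
    return LLM(
        private_key=private_key,
        rpc_url=rpc_url,
        tee_registry_address=tee_registry_address,
    )
```

Inside `OpenGradientChatModel`, the same conditional would presumably replace the previous unconditional `LLM(...)` call, leaving the default registry-based path untouched.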