Confirm this is an issue with the Python library and not an underlying OpenAI API
Describe the bug
When using the `AsyncResponses` client to parse structured outputs, Pydantic models are left behind after each request, causing memory to accumulate. Flame graph of the memory leak:
To Reproduce
- Use the `AsyncResponses` `.parse()` API with a Pydantic model
- Send large base64 images in the messages using data URIs
- Observe that the memory is not freed after the requests complete
Code snippets
OS
macOS
Python version
Python 3.13.7
Library version
openai v2.31.0