I am using Falcor's Python API to implement 3D Gaussian Splatting. In my implementation, some buffers need to be recreated every frame.
However, there doesn't seem to be a Python API function for explicitly releasing the GPU memory backing these buffers, so GPU memory usage grows rapidly frame by frame.
I tried adding a corresponding API in C++ to release the GPU memory manually, but the memory still was not freed.
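One possible explanation (a guess, not Falcor-specific): in Python bindings, the native resource behind a wrapper object is typically destroyed only when the last Python reference to it is dropped. If something (a list, a cache, a captured variable) still references last frame's buffer, recreating the buffer each frame leaks memory, and even a manual C++ release call can be defeated by the wrapper keeping the resource alive. Below is a minimal pure-Python sketch of this reference-lifetime effect; `Buffer`, `render_frame_leaky`, and `frames_history` are hypothetical names, not Falcor API:

```python
class Buffer:
    """Stand-in for a GPU buffer wrapper (hypothetical, not Falcor's class)."""
    live = 0  # counts buffers whose (simulated) GPU memory is still allocated

    def __init__(self, size):
        self.size = size
        Buffer.live += 1

    def __del__(self):
        # In a real binding, the native destructor would free GPU memory here.
        Buffer.live -= 1


frames_history = []  # e.g. a cache that accidentally keeps old buffers alive

def render_frame_leaky(frame_index):
    buf = Buffer(1024)
    frames_history.append(buf)  # lingering reference: old buffers never freed

for i in range(100):
    render_frame_leaky(i)
print(Buffer.live)   # 100: one live buffer per frame, memory grows

frames_history.clear()  # drop the references -> destructors run promptly
print(Buffer.live)   # 0: memory is returned once nothing references the buffers
```

So before adding a custom release API, it may be worth checking (e.g. with `gc.get_referrers`) whether any Python-side object is still holding the old buffers; dropping those references should let the existing destructor path free the GPU memory.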