Will you add support for oobabooga's text-generation-webui? An LLM initialization that issues POST requests, plus a few prompt patterns, might be sufficient. I've been trying to do it myself, but I've had to reverse-engineer what gets sent from llms/initialize.py and LangChain (was TextGen involved somewhere?). I have a hacky version working now, but it doesn't support streaming and just isn't done properly.
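For reference, here's a minimal sketch of the kind of POST-based initialization I mean, assuming text-generation-webui's legacy blocking API at `/api/v1/generate` on its default port 5000 (the endpoint, port, and parameter names are assumptions about the webui's API, and `build_payload`/`generate` are hypothetical helper names, not anything from this repo). It's non-streaming, which is exactly the limitation my current version has:

```python
import json
import urllib.request

# Assumed default address of text-generation-webui's blocking API.
API_URL = "http://localhost:5000/api/v1/generate"

def build_payload(prompt: str, max_new_tokens: int = 250,
                  temperature: float = 0.7) -> dict:
    # Parameter names follow what I believe the webui's blocking API
    # accepts; adjust to match the running server's version.
    return {
        "prompt": prompt,
        "max_new_tokens": max_new_tokens,
        "temperature": temperature,
    }

def generate(prompt: str, **params) -> str:
    data = json.dumps(build_payload(prompt, **params)).encode("utf-8")
    req = urllib.request.Request(
        API_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The blocking API returns {"results": [{"text": "..."}]} as far as I can tell.
    return body["results"][0]["text"]
```

Streaming would need the webui's websocket endpoint (or, in newer builds, its OpenAI-compatible API) instead, which is where my attempt falls apart.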