Just downloaded v3.14.19.
Installed it in VS Code. Selected Local. Selected a model. Nothing happened. I opened 'Change connection settings', entered the standard Ollama address, http://localhost:11434/ , and clicked Connect. I get the small bubble at the top saying connected, then the overlay just flashes back on. Some interesting things:
http://localhost:11434/
http://localhost:11434/v1
http://localhost:11434/api
all connect.
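For what it's worth, here is a hypothetical quick check (not part of the extension) I used to confirm the server answers on each of those base paths. The `probe_ollama` name is mine; `/api/tags` and `/v1/models` are Ollama's native and OpenAI-compatible model-listing endpoints:

```shell
# Probe each base URL the extension might try and print the HTTP status.
probe_ollama() {
  for path in "" "/api/tags" "/v1/models"; do
    url="http://localhost:11434${path}"
    # -m 2 caps each attempt at 2 seconds; "000" means unreachable
    code=$(curl -s -o /dev/null -m 2 -w '%{http_code}' "$url")
    echo "$url -> $code"
  done
}
probe_ollama
```

All three report 200 here, so the server side looks healthy and the problem seems to be in the extension.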
If I restart the extension, everything is fine for a second or two with the correct model showing at the bottom left, then it instantly flashes to 'Connect your API key' and the bottom-left model switches to Anthropic.