Ollama integration with Nextcloud

No luck here either with open-webui. (My workaround is to use LocalAI).

I have an open-webui chat working. I’m trying out the mistral-nemo model. Happy to share my Docker compose file if anyone wants it.
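For anyone who wants a starting point, here's a minimal sketch of the sort of compose file I mean (images, env var, and port mapping per the open-webui docs; volume names and the host port 3000 are just my choices):

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"          # open-webui listens on 8080 inside the container
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama

volumes:
  ollama:
  open-webui:
```

After `docker compose up -d`, pull the model with `docker compose exec ollama ollama pull mistral-nemo`.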

Nextcloud can’t use open-webui.

Starting at `/settings/admin/ai` in Nextcloud, I'm able to enter my open-webui server URL and API key. I then see a GET to `/v1/models` in the open-webui server log, but, confusingly, the XHR request from the Nextcloud AI settings page to `/apps/integration_openai/models` fails with an HTTP 500 server error (visible only in the browser console – `/settings/admin/ai` itself shows no error). Maybe Nextcloud doesn't like the response from open-webui?
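One way to narrow this down would be to fetch `/v1/models` from open-webui directly (e.g. with curl plus your API key as a Bearer token) and check the shape of the JSON. My assumption is that integration_openai expects the standard OpenAI list format, a `data` array of objects each carrying an `id` – a sketch of that check:

```python
import json

# Hypothetical sample of an OpenAI-style /v1/models response.
# The model id below is illustrative; substitute whatever curl returns.
sample = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "mistral-nemo:latest", "object": "model", "owned_by": "ollama"}
  ]
}
""")

# If "data" is missing, empty, or a different shape, that could be
# what trips up Nextcloud's models endpoint.
model_ids = [m["id"] for m in sample.get("data", [])]
print(model_ids)
```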

I'm able to try text generation, and I see POSTs from Nextcloud to `/v1/chat/completions` when I do, but that endpoint always returns an HTTP 500 server error. open-webui logs `ERROR: Exception in ASGI application`, a big stack trace, and finally `IndexError: tuple index out of range`.
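It might help to replay a minimal request against open-webui directly, bypassing Nextcloud, to see whether the `IndexError` fires for any client or only for whatever extras Nextcloud sends. Here's a sketch of the smallest valid OpenAI-style chat body (field names per the OpenAI API; the model name is the one I'm running):

```python
import json

# Minimal OpenAI-compatible chat completion request body.
payload = {
    "model": "mistral-nemo:latest",
    "messages": [{"role": "user", "content": "Say hello"}],
}
body = json.dumps(payload)
print(body)
```

Save that JSON to a file and send it with something like `curl -H "Authorization: Bearer $KEY" -H "Content-Type: application/json" -d @body.json http://localhost:3000/v1/chat/completions`. If this also 500s, the problem is on the open-webui side rather than in what Nextcloud sends.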

These look relevant/related, or at least may contain hints on how we might get this to work: