Ollama integration problem

Hey All,

This is probably due to my lack of experience, but I’ve run into a problem.
I’m trying to add Deepseek to Nextcloud as an experiment. I have Ollama up and running, and through OpenWebUI everything works.

When I try to integrate it into Nextcloud, I run into this problem:

It appends /models to the address, which ends in a 404 error.

This seems to be correct, as the path doesn’t exist even when I try to access it locally:

localhost:11434 is working; I get the “Ollama is running” page.

localhost:11434/models is a 404 error.
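
In case it helps anyone reproduce this, here is roughly how I checked both paths (a minimal Python sketch, assuming Ollama listens on its default port 11434):

```python
import urllib.request
import urllib.error

# Probe the Ollama server root and the /models path that Nextcloud requests.
for path in ("", "/models"):
    url = f"http://localhost:11434{path}"
    try:
        with urllib.request.urlopen(url) as resp:
            # The root path answers with the "Ollama is running" page.
            print(url, "->", resp.status, resp.read().decode().strip())
    except urllib.error.HTTPError as e:
        # /models does not exist on the native API, so this prints 404.
        print(url, "->", e.code)
```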

I’m unsure what to do; I tried searching Google, but didn’t find anything useful.

I don’t see an option in the WebGUI to change this, nor a way in Ollama to “create” this path.

According to their documentation here (ollama/docs/openai.md at main · ollama/ollama · GitHub), you need to add /v1 to the Service URL. The correct path should be /v1/models.
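
A quick way to confirm the fix (a minimal sketch, assuming the default port 11434 and that the endpoint follows the OpenAI list format with a "data" array, as described in that doc):

```python
import json
import urllib.request

# Query the OpenAI-compatible models endpoint (note the /v1 prefix).
with urllib.request.urlopen("http://localhost:11434/v1/models") as resp:
    payload = json.load(resp)

# The OpenAI-style response wraps the model list in a "data" array;
# each entry's "id" is the model name Ollama has pulled.
for model in payload.get("data", []):
    print(model["id"])
```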

I’m ashamed of how blind I am… Thanks! Everything checks out now!
