I have installed Context Chat, Context Chat Backend, Artificial Intelligence, and the OpenAI and LocalAI integration apps. I've pointed this at an Ollama instance running externally.
I’ve confirmed this works. Now, I would like to interact with this using Context Chat.
But it seems like I can’t use my own Ollama instance to chat with my documents?
I cannot figure out how to change these two settings.
The context_chat and context_chat_backend apps will use the Free text-to-text task processing providers like OpenAI integration, LLM2, etc. and such a provider is required on a fresh install, or it can be configured to run open source models entirely on-premises. Nextcloud can provide customer support upon request, please talk to your account manager for the possibilities.
(Source: App: Context Chat — Nextcloud Administration Manual)
This means that if your LocalAI instance is configured as a provider for the text-to-text task, it is also used by Context Chat.
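If the goal is to point the OpenAI/LocalAI integration directly at Ollama, something along these lines may work. The app id `integration_openai` is the real app id, but the `url` config key and the host name below are assumptions on my part, so verify the actual field in the app's admin settings page. Ollama does expose an OpenAI-compatible API under `/v1`.

```shell
# Point the OpenAI/LocalAI integration at Ollama's OpenAI-compatible endpoint.
# NOTE: the "url" config key is an assumption; check Settings > Administration
# > Artificial Intelligence (or the integration_openai admin page) for the
# authoritative field name. "my-ollama-host" is a hypothetical host; Ollama's
# default port is 11434.
sudo -u www-data php occ config:app:set integration_openai url \
  --value="http://my-ollama-host:11434/v1"
```

The same endpoint can be set through the web UI under the OpenAI/LocalAI integration settings if you prefer not to use occ.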
Thanks for that. My Artificial Intelligence “Free Text to Text” is set to LocalAI.
However, using the "Chat with AI" interface does not send anything to my configured OpenAI API settings. I can tell this in two ways: 1) nothing is registered in my Ollama logs during processing, and 2) it returns the response in a different language.
When using "Context Chat" I see the job get submitted, but it then processes for a very long time and eventually fails. Again, nothing in the Ollama logs. (I am not sure where to look for the failure log info on the Nextcloud side yet.)
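One place I plan to check, assuming a standard installation, is the Nextcloud server log; the data directory path below is typical but may differ on your install:

```shell
# Show recent entries from Nextcloud's server log via occ,
# then re-run the Context Chat task and look for new errors.
sudo -u www-data php occ log:tail

# Alternatively, watch the raw JSON log directly.
# The data directory path is an assumption; adjust for your setup.
tail -f /var/www/nextcloud/data/nextcloud.log
```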
When using "Work with images" I immediately see a call to the Ollama backend, and it returns a response within seconds. (I did set up a worker to pick these tasks up faster than the stock 5-minute interval.)
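For reference, the worker I set up follows the pattern from the Nextcloud AI admin documentation; the worker count is arbitrary, and the job class name should be double-checked against your Nextcloud version:

```shell
# Run dedicated task-processing workers so AI tasks are picked up
# immediately instead of waiting for the 5-minute background-job interval.
# The worker count (-t 4) is arbitrary; tune it to your hardware.
sudo -u www-data php occ background-job:worker -t 4 \
  'OC\TaskProcessing\SynchronousBackgroundJob'
```

In practice this is usually wrapped in a systemd service so the workers restart automatically.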