Nextcloud Assistant / LocalAI does not work properly

Hello there,

I am running NC 27.1.1 inside a Docker container and am trying to integrate LocalAI into my Nextcloud using the OpenAI & LocalAI integration app.

LocalAI is running in a separate container (called api) on the same network as the Nextcloud container. When I open a bash shell in the Nextcloud container, I can interact with the LocalAI container without any problems: the API is running fine, and the queries I fire at it get answered. I also entered the correct API endpoint in the Nextcloud configuration. See screenshot:

As you can see, Nextcloud fetched the completion model that I set up in LocalAI.
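To double-check the connectivity, this is roughly the kind of test I run from inside the Nextcloud container (a sketch of my setup: the container names come from my compose file, 8080 is LocalAI's default port, and /v1/models is LocalAI's OpenAI-compatible model listing endpoint):

docker exec -it nextcloud bash
curl http://api:8080/v1/models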

To set up LocalAI, I used the following model: https://huggingface.co/TheBloke/GPT4All-13B-snoozy-GGML/resolve/main/GPT4All-13B-snoozy.ggmlv3.q4_0.bin

I placed the model in LocalAI’s models folder and created the following files, as stated in the LocalAI documentation.

hvgpt.yaml

backend: gpt4all-llama
context_size: 1024
f16: false ## if you are running on CPU, set this to false
name: hvgpt
parameters:
  model: GPT4All-13B-snoozy.ggmlv3.q4_0.bin
  temperature: 0.2
  top_k: 80
  top_p: 0.7
template:
  chat: hvgpt-chat
  completion: hvgpt-completion

hvgpt-completion.tmpl

The prompt below is a question to answer, a task to complete, or a conversation to respond to; decide which and write an appropriate response.
### Prompt:
{{.Input}}
### Response:

hvgpt-chat.tmpl:

Complete the prompt
### Prompt:
{{.Input}}
### Response:
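Placing the model itself is just a download into the models folder that LocalAI mounts (a sketch; the local path depends on how the models volume is mapped in your setup):

wget -P ./models/ https://huggingface.co/TheBloke/GPT4All-13B-snoozy-GGML/resolve/main/GPT4All-13B-snoozy.ggmlv3.q4_0.bin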

When I open the Nextcloud Assistant and enter a prompt, the following shows up in the LocalAI container log (full output on Pastebin):

8:40PM DBG Request received:
8:40PM DBG Configuration read: &{PredictionOptions …

At the same time, the following appears in the Nextcloud log (full output on Pastebin):

{"reqId":"jiKxXxKPuNWGg829qQjx","level":2,"time":"2023-09-26T20:40:35+00:00","re…

The entire operation errors out and my query remains unanswered. :frowning:

I haven’t found anyone online who has tried setting up LocalAI together with Nextcloud in this configuration, so I’m hoping somebody here has already had some luck with it. Has any of you tried?

Here is an example of a working query that I fire from the Nextcloud container:
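Roughly, it looks like this (an illustrative sketch: hvgpt is the model name from the config above, api is the LocalAI container, 8080 its default port, and the prompt is just an example; the endpoint is LocalAI's OpenAI-compatible completions API):

curl http://api:8080/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "hvgpt", "prompt": "What is Nextcloud?", "temperature": 0.2}'

This comes back with a normal completion, so LocalAI itself is definitely reachable from the Nextcloud side.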

Thanks for any help & debugging!

Cheers
Phil

Hi!
Same issue here… I have tried countless things and followed all the instructions given by LocalAI, but I can't get it to work. I can choose the installed models, but all tasks fail. Using the CLI works fine from any host on my network.

Hi, I always got an error 500 from the LocalAI container.

I run it on a separate cloud VM in an internal network.
The UFW firewall was set to allow all internal traffic. Still error 500.

After completely disabling UFW on the LocalAI Docker host, it worked in Nextcloud :slight_smile:
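If you'd rather keep UFW enabled, opening only the LocalAI port for the internal subnet should in principle be enough (a sketch; port 8080 and the subnet are assumptions, adjust them to your network):

sudo ufw allow from 192.168.0.0/16 to any port 8080 proto tcp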