Image generation with LocalAI results in server 500 error

Dear all,

I recently set up a virtual machine for Docker containers, mainly for my brand-new LocalAI Docker container.
I use Nextcloud 28.0.5.
I pointed the OpenAI and LocalAI integration app towards it and use a custom model that has mastered the German language.
So far, so good: it is able to write hilariously great rants on various topics, text generation works in Nextcloud Assistant, and its German is great.

The problem is image generation. It fails with a pop-up: "Assistant error".
Stable Diffusion is already part of that LocalAI Docker container.
It also fails when using a slash to open the Smart Picker (at least the Smart Picker opens), and in Nextcloud Office the Smart Picker button doesn't react at all.

The weird thing is that image generation also fails when using the "Local image generation with Stable Diffusion" app, for which I downloaded the models via an occ command.

What am I missing?
My machine is AVX2 capable, but not AVX512.

Two log entries are generated during a failing image generation attempt:

[integration_openai] Warning: API request error : Server error: `POST http://XXX.XXX.XXX.XXX:8080/v1/images/generations` resulted in a `500 Internal Server Error` response:
{"error":{"code":500,"message":"rpc error: code = Unavailable desc = error reading from server: EOF","type":""}}

	POST /ocs/v2.php/apps/assistant/api/v1/i/process_prompt
	from XXX.XXX.XXX.XXX by admin at 30.04.2024, 00:46:07

And again, more briefly:

[integration_openai] Warning: OpenAI/LocalAI's text to image generation failed with: API request error: rpc error: code = Unavailable desc = error reading from server: EOF
	POST /ocs/v2.php/apps/assistant/api/v1/i/process_prompt
	from XXX.XXX.XXX.XXX by admin at 30.04.2024, 00:46:07
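To rule out the Nextcloud side, I suppose the LocalAI endpoint could also be tested directly from the VM with curl. This is just a sketch of what I understand the OpenAI-compatible images API expects; the model name "stablediffusion" is an assumption and would need to match the model configured in the LocalAI container:

```shell
# Hit LocalAI's image generation endpoint directly, bypassing Nextcloud.
# The IP placeholder and the model name are assumptions; adjust to your setup.
curl http://XXX.XXX.XXX.XXX:8080/v1/images/generations \
  -H "Content-Type: application/json" \
  -d '{"prompt": "A cute baby sea otter", "model": "stablediffusion", "size": "256x256"}'
```

If this returns the same "rpc error: code = Unavailable" 500 response, the problem would seem to be inside LocalAI itself rather than in the integration app.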

Some weeks back, it worked as a test with OpenAI's DALL-E 2, but I can't test it again, as DALL-E 2 seems to have been shut down for free users and DALL-E 3 is no longer part of any free tier for testing.

Besides, I want to figure this out and run my own AI. :wink:

Unfortunately, I don't know how to read LocalAI's logs, as it's locked inside a Docker container on a headless Debian LXC container.
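From what I've gathered so far, the container's stdout/stderr should be readable from the LXC shell with plain Docker commands, though I haven't verified this yet. The container name "local-ai" here is an assumption; `docker ps` would show the real one:

```shell
# List running containers to find the LocalAI container's name or ID
docker ps

# Follow the LocalAI container's log output live while retrying the generation
# ("local-ai" is an assumed container name; substitute your own)
docker logs -f local-ai

# Or dump just the last 200 lines after a failed attempt
docker logs --tail 200 local-ai
```

LocalAI reportedly also supports a DEBUG=true environment variable for more verbose output, but I haven't tried that either.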

Any help is greatly appreciated.

Have a good night!