Files become zero bytes

Can you add this information to the GitHub issue? Don't forget to mention the version numbers of server and client. Since this happened recently, can you check your server logfiles to pinpoint when it actually happened?

In your logfile it looks like this:

1.2.3.4 - user [12/May/2018:18:13:06 +0000] "PUT /owncloud/remote.php/dav/files/user/folder/file.txt HTTP/1.1" 204 795 "-" "Mozilla/5.0 (Linux) mirall/2.4.1 (build 9083)"

You can grep through your logs to find the files that were pushed with size 0; for the log format above it would be something like this:

awk '$6=="\"PUT" && $10==0 {print $7}' access.log
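To also see *when* the zero-byte uploads happened, you can print the timestamp along with the path. A small self-contained sketch; the two log lines below are made-up examples in the same combined log format (everything here is illustrative, not from a real server):

```shell
# Two hypothetical log lines in the combined format shown above:
cat > access.log <<'EOF'
1.2.3.4 - user [12/May/2018:18:13:06 +0000] "PUT /owncloud/remote.php/dav/files/user/folder/empty.txt HTTP/1.1" 204 0 "-" "mirall/2.4.1"
1.2.3.4 - user [12/May/2018:18:14:06 +0000] "PUT /owncloud/remote.php/dav/files/user/folder/ok.txt HTTP/1.1" 204 795 "-" "mirall/2.4.1"
EOF

# Print timestamp and path of every PUT that was logged with size 0:
awk '$6 == "\"PUT" && $10 == 0 { print $4, $7 }' access.log
```

On a real server, replace `access.log` with your actual webserver access log.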

(I haven't seen any 0-byte files in the last few weeks, so for me it probably originates from an older version as well.)

Hi,

Yeah, but as I said… some files were just random. But Sonderzeichen ("special characters", what is it in English?)

https://www.deepl.com/translator#de/en/Deutsche%20Sonderzeichen

https://www.deepl.com is your friend! :wink:

CUL8R
Django

I never visit any website except *.nextcloud.com :stuck_out_tongue:

But since you're so familiar with it, from now on I'll write German and you can all run it through DeepL.

For what it's worth, I see many 0-byte files on my system that are uploaded only by the Android Nextcloud app (fairly up to date). I can imagine instances where an image gets queued for upload, then the image is deleted before a Wi-Fi connection is available, so the upload fails but a 0-byte placeholder remains (just a theory, no evidence to back that up).

Other 0-byte files I see are .nomedia files and various cache files, which are legitimately 0 bytes.

I've been using NC since the OC 8 days, with the old OC Windows/Linux desktop client, and interestingly enough have not seen any 0-byte files coming from there.

How does the Nextcloud desktop client (or the Android app) check if an uploaded file is exactly the same as the file on the client?

Is there a CRC check?

The way it worked: when you put a new file, the client uploads it, and each file gets a file-id. The server keeps the file-id in the main database, and the client has a local database where changes are tracked; if there are updates, the files are transferred again.

ownCloud implemented a checksum feature, but I think it was only to verify that the transfer finished correctly. So you would still need to sync the files a first time; you can't just put the file on the server manually and have the client compare checksums and conclude it's the same file. I'm not sure if this feature was backported to Nextcloud (it wasn't available in NC 12, and there were issues when people migrated from oC), and I don't know if this is planned. ownCloud is also currently testing a delta-sync. Nextcloud has not been very active in desktop client development, but this is changing a bit since they plan to implement client-side encryption: Desktop client has less releases and is behind master
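For what it's worth, that checksum mechanism works via an upload header: the client hashes the file locally and sends the hash alongside the PUT so the server can verify the transfer arrived intact. A rough sketch of the idea below; the URL and credentials are placeholders, and whether the `OC-Checksum` header is honoured depends on your server version:

```shell
# Create a sample file and compute the checksum a client would send:
printf 'hello\n' > file.txt
SUM="SHA1:$(sha1sum file.txt | cut -d' ' -f1)"
echo "$SUM"

# The upload itself would then look roughly like this
# (hypothetical URL/credentials; OC-Checksum is the transfer-verification
# header used by ownCloud/Nextcloud clients):
# curl -u user:password -T file.txt -H "OC-Checksum: $SUM" \
#      https://cloud.example.com/remote.php/dav/files/user/file.txt
```

Note this only detects a corrupted transfer; it does not let the client skip an upload by matching an existing server-side file, which is exactly the limitation described above.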

As I understood your post, @tflidd, the Nextcloud desktop sync client does not have a CRC check, and that is the reason why some files have a file size of 0 bytes after the initial upload by the client.

After an initial upload they shouldn't have size 0 (unless the files are really empty). But if an error during the upload process results in 0-sized files, that could be a reason, yes.

I'm also seeing lots of files initially show the correct size and then revert to 0 KB. I'm not using the sync client but adding files directly over WebDAV via MountainDuck. MountainDuck mounts remote resources as physical drives on your PC/Mac, and can even do this with Rackspace cloud files etc. So the issue is unlikely to be with MountainDuck but with the Nextcloud server itself.
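If you want to check whether the server itself stores a file as 0 bytes (independent of what MountainDuck displays), you can ask for the stored size over WebDAV with a standard PROPFIND. A sketch; the URL and credentials are placeholders:

```shell
# WebDAV PROPFIND body asking only for the stored file size:
BODY='<?xml version="1.0"?><d:propfind xmlns:d="DAV:"><d:prop><d:getcontentlength/></d:prop></d:propfind>'
echo "$BODY"

# Query one file's size on the server (hypothetical URL/credentials):
# curl -s -u user:password -X PROPFIND -H "Depth: 0" --data "$BODY" \
#      https://cloud.example.com/remote.php/dav/files/user/folder/file.txt
```

The response XML contains a `<d:getcontentlength>` element with the size the server believes the file has.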

MountainDuck seems to be a quite capable and convenient tool. But even if you don't have anything to hide, isn't it a problem if metadata could be transferred to the proprietary, non-open-source software manufacturer? Because maximum autonomy is one of the great features of Nextcloud. On the other hand, Win and Mac are proprietary as well :slight_smile:

It's April 16, 2020, this thread is two years old, and I'm experiencing this problem on a fresh install with the latest client and server. Going to dig a bit further into this issue, as this was the first result Google returned for me. This is a HUGE issue and basically makes the product useless for production use. Way too risky to maybe have your files synced and accessible.

Sure, it is a serious issue, but not easy to track, as you can see in this thread. There is a related bug report, so it is better to track it down with the developers, because they eventually have to look into this: https://github.com/nextcloud/server/issues/3056#issuecomment-274770918

The latest comment seems to get them back on track with this issue.

I haven't really been bothered by it since (at least on actively used data), but I have to investigate again and report back.

Actually, that is a very good starting point if we are looking for a way to reproduce this reliably. Many have older systems where it's not really known when it happened; some things might date back to very old versions, etc. But the bug tracking itself should happen on GitHub, so if you can share more details there, that would be great.

Here are my observations from further testing. The NC client installed on my macOS machine (latest version with security updates and Norton Internet Security) incorrectly showed green check marks on the synced folder in Finder. I thought it was finished, but there was still transfer activity in the background that wasn't obvious. Even down in the subdirectories I was transferring, a green check mark told me they were synced. I suspect the zero-byte files are "placeholders" for the upload process, similar to the *.part files you get when downloading in Firefox.

Before I discovered the above, I copy/pasted some test folders that were previously zero bytes. They immediately became available and could be opened, much faster than a re-upload could provide. This seemed to kick off the rest of my transfers.

This issue (for me) seems to happen only when I try to transfer a large number of files, e.g. 10 GB of photos. So I echo what others have stated on this.

I set up a second NC server (Ubuntu 18.04 / latest stable NC) within my home LAN and tested transferring the same data set. While I had different syncing issues, my data correctly synced in the background (transferring pics from the iPhone photo gallery using the NC auto-sync feature).

I suspect that the NC client is having issues with my basic GoDaddy shared web hosting. I have the Linux Deluxe account with shell access with them.

I also noticed that in my bulk file transfer, the job returns an error in the client when it hits files of ~25 MB and up: "server replied 503: service unavailable". My internet connection is rock solid, and I suspect that GoDaddy shared hosting doesn't like me syncing a large number of files this way, as the package I got wasn't meant to be a Dropbox-like replacement. While the sync job is again reported as finished by the NC client, it seems to still be running, as I am getting MacBook popup notifications.
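503s that start at a certain file size on shared hosting often come down to PHP limits being hit. If your plan lets you override settings (e.g. via a custom php.ini or .htaccess), these are the values to compare against; the numbers below are illustrative assumptions for larger syncs, not official recommendations:

```ini
; Illustrative php.ini values for larger syncs (adjust to your hosting plan):
upload_max_filesize = 2G
post_max_size = 2G
max_input_time = 3600
max_execution_time = 3600
memory_limit = 512M
```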

Update:
My initial/baseline sync seems to have completed: about 10 GB of photos and documents. I can now see the documents via web browser, and the files are reporting sizes. I am almost certain that the issue I am facing, and perhaps others are too, is the way the local NC client reports job completion.

At first, the NC client was reporting zero bytes and "job completed" while in the background it was still uploading files. In the NC client folder list, folders show a green check (like Dropbox) when synced, even though the client has yet to upload the files… so I assume it's some sort of lock file during the sync/upload process. Now, after a few days of running, the files seem to have synced, and checking in the browser I can see the images I uploaded load. So the files are there and synced now, but the NC client is still reporting "zero files" beside the folder (for file count, not byte size this time). As far as I can see, the files are there and synced; the client is just not updated.

So, all this to say, I believe the issue may be two-fold.

  1. My shared GoDaddy hosting account (Linux Deluxe with SSH) has poor performance and perhaps some throttling / resource limitations, causing slow connections that the NC client might be choking on.

  2. The way the NC client verifies / refreshes sync status does not get updated properly and/or in an intuitive way.

I'm very, very new to using NC, so consider me to be looking at this issue with brand-new customer eyes that may reflect the broader newcomer experience.

I absolutely love the NC product, and it's awesome that it is free (thank you). I just hope the devs take note of these issues and work out this bug. I saw the new release in beta advertised with an Ubuntu 20.04 server, so I am excited to see what has been improved.

Devs: feel free to reach out for more info; I'd be happy to help your cause. I'm an IT guy.

On hosted accounts there is sometimes some strange behaviour. If you set up a server yourself, you should check the configuration against the documentation to make sure everything runs properly; improved cache settings increase sync performance dramatically. On shared hosting you don't have access to any of this, and some configuration settings can hurt performance. And it's hard or impossible to debug.

Well, I posted the reason for that bug on 2018-06-13, but it seems they did not pay attention: Files getting set to 0 in data folder · Issue #3056 · nextcloud/server · GitHub

:frowning:

And yes, I still have it sometimes as well.

I just hit the same issue: latest Nextcloud running in Docker, with the latest Windows client.
I transferred a bunch of larger files, ~130 GB in total, and 2 of the larger files are 0 bytes. I got a bunch of 503 errors during upload in the client, so it restarted, hit 503 again, restarted, and so on for several hours. It finally finished, and now I'm left with 0-byte files.

I'll mark this as solved in the forum. There seems to be a bug somewhere, and finding and fixing it has to be discussed with the developers. Please contribute there if you can:

Could be related to Object Storage being on S3:

Hi,
I realize I have the same problem. Some files are OK, but some show a size in kB and are in fact corrupted (empty, or won't open in software).
From the GitHub issues it seems related to 504 errors during upload, which I experienced a lot. In the meantime, if the developers have solved the problem, is there a way to force a re-sync of all files from the client without unlinking everything and restarting from scratch? BTW, just clicking "force synchronization" doesn't work, since NC thinks all the files are correctly synced to the server.
Thanks
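For anyone auditing their server for damage in the meantime: you can list the zero-byte files in the data directory and then rescan so the database matches what is on disk. Below is a self-contained demo of the `find` part; on a real server you would point it at your actual Nextcloud data directory, and the paths in the commented commands are placeholders:

```shell
# Demo: a data-like directory with one empty and one non-empty file
mkdir -p data/user/files
: > data/user/files/broken.jpg
printf 'content' > data/user/files/ok.txt

# List the zero-byte files:
find data/user/files -type f -size 0 -print

# On a real server, afterwards resync the file cache with the disk
# (placeholder paths and web-server user):
# sudo -u www-data php /var/www/nextcloud/occ files:scan --all
```

Keep in mind this only tells the server's database the truth about the damaged files; it does not recover their content.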