Desktop client 3.4.0 destroys local time stamps and keeps uploading data to server

Yeah, we have done that in the past and might do it again, let's see. It is also nice to be able to say "and you can get it NOW" :wink:

anyhow, stay tuned - a test version of 3.4.1 will come soon.

1 Like

@jospoortvliet Thanks for the update and the explanation!

To answer your question about how to make more people test the RC, a few ideas:

  1. Provide a test server, including dummy files, folders, … so that any tester can log in, install the client and start testing, without risking any data loss (similar to what @AndyXheli proposes)
  2. I like the ideas above to communicate about RCs and new features, to make testing more appealing
  3. Offer a bonus to people who report feedback (notably those finding bugs). The bonus could be "2 hours of consulting/advice/audit provided by a Nextcloud engineer" (so that the tester receives advice on their own Nextcloud setup, for example), a higher priority for features suggested by the tester, …
  4. At each release, remind the community that testing RCs is super important and that Nextcloud needs it

Regarding 3.4.1, do you plan to release an RC that we could test? [edit, oops, yes, you wrote it above :blush: ]

In general, I think this question is key and would deserve a dedicated discussion, with a lot of publicity so that the community gets really involved.

2 Likes

Just for info, 3.4.0 is still the version offered in the stable Ubuntu PPA. https://launchpad.net/~nextcloud-devs/+archive/ubuntu/client

1 Like

Yes - and it's been working well here (Kubuntu 21.10). Maybe it's a Windows-related bug?

Hi,

A 3.4.1 RC is here for testing. We improved the reliability of the sync process to better deal with the bug, and it seems to fix it. Of course I should still point out that this is a test version, so you should back up your data etc.!

That said, the more realistic the test case, the more useful the testing is, of course… As usual, any feedback is welcome below!

:apple: Mac OS X: https://cloud.nextcloud.com/s/Lp6kJ46KPdQLtFD
:penguin: Linux: https://cloud.nextcloud.com/s/3RY5FRxKymfYDBe
🪟 Windows: https://cloud.nextcloud.com/s/29zWbXKdTRbLey3

Edit:
Another update which should fix problems with weird dates.

🪟 Windows: https://cloud.nextcloud.com/s/b6wdJktaP9PtnHN
:penguin: Linux: https://cloud.nextcloud.com/s/WH6LmjDoxaYW6by
:apple: Mac OS X: https://cloud.nextcloud.com/s/6bCAQFT8cX8Z7wS

1 Like

Almost nobody at Nextcloud runs Windows, and we had the problem ourselves too (we dogfood!), so I'm reasonably sure it's not Windows-specific :wink:

And yes, the test version above has already been tested by us internally.

Just a clarification - this only happens in conjunction with NC23, correct?

1 Like

I read some posts saying this issue also occurs with 22.2.3 - I am running 21.0.7 and can't see any problems…

You may look here: 3.4.0 keep synchronizing files after upgrade · Issue #4016 · nextcloud/desktop · GitHub

Just upgraded from 3.3.6 to 3.4.1 on Win11, and so far it is running with no sync issues.

Thanks Jos for …

  • confirming it's not a Windows issue (lately I had thought of that possibility, too)
  • the help & support in making sure 3.4.1 will not be flagged as a "stable version" to the client update agents :nerd_face: - as it should be :dizzy:

sadly no, it also happens with earlier releases. It seems the bulk-upload additions have triggered something bad, even though bulk upload is only active in NC 23.

thanks, that's good to know!

Could anybody that had the issue before try the 3.4.1RC perhaps?

I also notified everyone on 3.4.0 keep synchronizing files after upgrade · Issue #4016 · nextcloud/desktop · GitHub and got positive feedback

@jospoortvliet I had the issue before on Win 10 + NC 22.2.3 (Docker) and I'm willing to test, but it would be great if you could show some recovery steps that don't require restoring the whole instance - just in case it is not fixed and happens again :sweat_smile: - last time the restore took me 3 hours…

Maybe you can share some manual steps to find the affected files and restore valid versions using some commands… as I asked in
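Not an official recovery procedure, but as a rough first step, here is a hedged sketch for spotting candidate files. It assumes the bug leaves affected files with an implausibly old modification time; the cutoff date is a placeholder you'd adjust to whatever bogus dates your files actually show, and the folder path is your own sync folder:

```shell
# find_bad_mtimes DIR CUTOFF
# Lists regular files under DIR whose mtime is older than CUTOFF.
# On an affected sync folder, files with a mangled timestamp should
# show up here; everything it prints deserves a closer look before
# restoring an older version from server-side file versioning.
find_bad_mtimes() {
    find "$1" -type f ! -newermt "$2" -print
}

# Example (hypothetical path -- point it at your own sync folder):
# find_bad_mtimes "$HOME/Nextcloud" "1990-01-01"
```

This only *finds* suspicious files; actually restoring valid versions would still go through the server's versions feature or your backup.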

I sadly don't really know how to fix these things myself; I've asked one of the desktop client developers, who incidentally also has sysadmin experience, to pitch in.

the nature of the sync makes it hard to predict what happens and when… desktop clients are a little unpredictable about what happens and when (long scanning phase, then fast uploading), and I'm a little discouraged from running the test again with the possible "benefit" of another 3 hours restoring my instance…

We know the problem occurs somehow… you suppose you have a valid fix related to the batch upload feature… but from my understanding it doesn't really matter how the client uploads the files; one by one or in batches, there is absolutely no reason to touch the ctime/mtime of the files… and if you are not 110% confident the fix works, it's better to spend another day thinking about a recovery strategy… velocity is not always the key - especially in a situation like this, where the bug has a huge impact on the whole instance. Keep calm, step back, deliver right the first time, and avoid breaking the same system twice…
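On the "no reason to touch ctime/mtime" point, one cheap way to verify a candidate client build is to snapshot timestamps before and after a sync run and diff them. This is just a sketch of that idea, not anything official; the paths in the commented usage are placeholders for your own sync folder:

```shell
# snapshot_mtimes DIR
# Records "mtime path" for every regular file under DIR, sorted by path,
# so two snapshots taken before and after a sync run can be diffed.
# Any differing line points at a file whose timestamp the client touched.
snapshot_mtimes() {
    find "$1" -type f -printf '%T@ %p\n' | sort -k2
}

# Usage (hypothetical folder -- point it at your sync dir):
#   snapshot_mtimes "$HOME/Nextcloud" > before.txt
#   # ... let the client finish a sync run ...
#   snapshot_mtimes "$HOME/Nextcloud" > after.txt
#   diff before.txt after.txt
```

An empty diff on unchanged files is exactly the behaviour one would expect from a fixed client.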

In my eyes it's worth going the extra mile: start by intentionally breaking an installation with 3.4.0 and add a recovery step (to verify it works), rather than just fast-forwarding to the "fixed" state - especially if you think about the customers who still receive the buggy 3.4.0 update through different channels and may hit the problem… or others who might not have noticed it yet… we have a chance to give them a helping hand to recover from the issue rather than telling them "it's your business to have a good backup strategy, and this is a good chance to test your recovery" :face_vomiting: you can count on me if you choose this way…

Perhaps you can create a specific new user and run the tests for this user first. Not sure about the reproduction steps, whether you need some data first, etc. But if it is a new user with some random data, you can clean that up afterwards more easily. Ideally, you test with the 3.4.0 version first to confirm you have the error, and then check that for the new user it is gone in 3.4.1.

I absolutely agree, and I'm aware of the possible steps to repro the issue… one could start with a virtual machine and an NC instance without production files, check if the problem exists, go a step further with a fake user on your production instance with your production client, and finally end with your production user and production data:

each additional step you add to improve safety and avoid data loss costs you time and resources, and if the problem strikes at the very last stage, when you trust the update and test on your production data, you end up in the same situation - hours of recovery work - on top of all the testing before…

chances are you find the issue running all the dry tests with a fake user/fake data before you touch the production data - but the question is who is willing to spend days of their spare time (and at the same time is well trained enough to understand and document the tests properly)…

and Nextcloud could run all these dry-run tests on their own (hopefully they do)… Nextcloud already has to run these tests for their paying customers… I feel it's a little unfair that they offload this time-consuming, dirty work to the community without providing good support and recovery options…

In the other discussion, about another legal entity, Jos and others always talked about money and resources - this is exactly the right discussion. I'm glad to give away a portion of my time and expertise for the community, but I'm not willing to work as a full-time test engineer without a reasonable return…

@jospoortvliet I'm ready to talk about a deal - I spend my time on comprehensive testing in exchange for "credits" with which I can choose features/bugs/improvements I would like to push from my side. We can definitely negotiate a good quota of my time vs. NC time…

You asked how to avoid spending 3h restoring data. And just using a fake user on your productive setup (and perhaps your local system) - does that take 30 min? Not hours. You are not supposed to do full testing, and you don't need to set up x virtual machines. You just want a fair idea of how your productive user might react.

What about the people spending their time helping other users, working on the documentation, translation etc.?

For certain bugs there is monetary compensation if you report them via HackerOne.