
[Bug]: Desktop Client freezes in a 100% CPU load loop after server upgrade to 25 #5094

Closed
5 of 8 tasks
RainerKlute opened this issue Oct 26, 2022 · 124 comments · Fixed by #5680

Comments

@RainerKlute


Bug description

After upgrading the server to NC 25 stable (25.0.0.18, to be exact), the desktop client (3.6.1 on openSUSE Tumbleweed) no longer syncs from the client to the server. Instead, the client freezes completely while using 100% CPU.

It turned out that the bulk upload feature is at fault. As a workaround, I turned off bulk upload in the server’s config.php file by adding the following line:

'bulkupload.enabled' => false,

Steps to reproduce

See the description above.

Expected behavior

The client should never freeze, even if the server is behaving incorrectly. Ideally, the client should support bulk upload. At the very minimum – for example, if the original error is on the server side – it should fall back to normal upload.
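For illustration, the fallback behaviour requested here could look something like the following sketch. This is Python pseudocode under stated assumptions, not the real client (which is C++); all function names are illustrative. The point is to attempt bulk upload once and degrade to per-file uploads on a server error, instead of retrying the same failing request in a tight loop.

```python
# Hypothetical sketch (not the actual client code) of the requested
# fallback: try one bulk upload, and on any server error degrade to
# per-file uploads instead of looping on the failing bulk request.
def sync_files(files, bulk_upload, single_upload):
    """Upload `files`, falling back to one-by-one upload on bulk failure."""
    try:
        bulk_upload(files)
    except RuntimeError:            # e.g. HTTP 500 from /dav/bulk
        for f in files:             # fall back to a plain PUT per file
            single_upload(f)

# Usage: a bulk endpoint that always fails, like the broken server above.
uploaded = []
def broken_bulk(files):
    raise RuntimeError("500 Unknown error while seeking content")

sync_files(["a.txt", "b.txt"], broken_bulk, uploaded.append)
print(uploaded)  # both files still arrive via the fallback path
```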

Which files are affected by this bug

Operating system

Linux

Which version of the operating system you are running.

openSUSE Tumbleweed

Package

Distro package manager

Nextcloud Server version

25.0.0.18

Nextcloud Desktop Client version

3.6.1

Is this bug present after an update or on a fresh install?

Fresh desktop client install

Are you using the Nextcloud Server Encryption module?

Encryption is Disabled

Are you using an external user-backend?

  • Default internal user-backend
  • LDAP/ Active Directory
  • SSO - SAML
  • Other

Nextcloud Server logs

{
   "app" : "webdav",
   "exception" : {
      "Code" : 500,
      "CustomMessage" : "Unknown error while seeking content",
      "Exception" : "Sabre\\DAV\\Exception",
      "File" : "/data/nextcloud/apps/dav/lib/BulkUpload/MultipartRequestParser.php",
      "Line" : 111,
      "Message" : "Unknown error while seeking content",
      "Trace" : [
         {
            "class" : "OCA\\DAV\\BulkUpload\\MultipartRequestParser",
            "file" : "/data/nextcloud/apps/dav/lib/BulkUpload/MultipartRequestParser.php",
            "function" : "isAt",
            "line" : 129,
            "type" : "->"
         },
         {
            "class" : "OCA\\DAV\\BulkUpload\\MultipartRequestParser",
            "file" : "/data/nextcloud/apps/dav/lib/BulkUpload/BulkUploadPlugin.php",
            "function" : "isAtLastBoundary",
            "line" : 71,
            "type" : "->"
         },
         {
            "class" : "OCA\\DAV\\BulkUpload\\BulkUploadPlugin",
            "file" : "/data/nextcloud/3rdparty/sabre/event/lib/WildcardEmitterTrait.php",
            "function" : "httpPost",
            "line" : 89,
            "type" : "->"
         },
         {
            "class" : "Sabre\\DAV\\Server",
            "file" : "/data/nextcloud/3rdparty/sabre/dav/lib/DAV/Server.php",
            "function" : "emit",
            "line" : 472,
            "type" : "->"
         },
         {
            "class" : "Sabre\\DAV\\Server",
            "file" : "/data/nextcloud/3rdparty/sabre/dav/lib/DAV/Server.php",
            "function" : "invokeMethod",
            "line" : 253,
            "type" : "->"
         },
         {
            "class" : "Sabre\\DAV\\Server",
            "file" : "/data/nextcloud/3rdparty/sabre/dav/lib/DAV/Server.php",
            "function" : "start",
            "line" : 321,
            "type" : "->"
         },
         {
            "class" : "Sabre\\DAV\\Server",
            "file" : "/data/nextcloud/apps/dav/lib/Server.php",
            "function" : "exec",
            "line" : 360,
            "type" : "->"
         },
         {
            "class" : "OCA\\DAV\\Server",
            "file" : "/data/nextcloud/apps/dav/appinfo/v2/remote.php",
            "function" : "exec",
            "line" : 35,
            "type" : "->"
         },
         {
            "args" : [
               "/data/nextcloud/apps/dav/appinfo/v2/remote.php"
            ],
            "file" : "/data/nextcloud/remote.php",
            "function" : "require_once",
            "line" : 167
         }
      ],
      "exception" : {},
      "message" : "Unknown error while seeking content"
   },
   "level" : 3,
   "message" : "Unknown error while seeking content",
   "method" : "POST",
   "remoteAddr" : "*****",
   "reqId" : "Qrxj5bFNAAPoYgTYGy9B",
   "time" : "2022-10-25T13:43:09+00:00",
   "url" : "/cloud/remote.php/dav/bulk",
   "user" : "*****",
   "userAgent" : "Mozilla/5.0 (Linux) mirall/3.6.1git (Nextcloud, opensuse-tumbleweed-6.0.2-1-default ClientArchitecture: x86_64 OsArchitecture: x86_64)",
   "version" : "25.0.0.18"
}

Additional info

Under “Is this bug present after an update or on a fresh install?” I had to select something, although I did not update the client.

@appliedbarrel

Looks like the same issue: #4106
It just cropped up again for me, too.

@mgallien
Collaborator

@RainerKlute would you be able to share your logs?
The error you report, while in need of a fix, should not cause such heavy CPU usage.

@EVOTk

EVOTk commented Oct 27, 2022

Hello,
Thanks for creating this ticket. I have the same problem, but with a CPU load of about 15–16%.

Adding 'bulkupload.enabled' => false, to config.php solved my problem.

[screenshot omitted]

like this: #5031

Server :
Nextcloud : 25.0.0
Nginx / Php 8

Client :
Windows 10 - 21H1
Desktop : 3.6.1 ( same in 3.6.0 )

Edit:

After reading this message: #4106 (comment)

I disabled the automatic upload speed limitation option, and the client no longer freezes.

@PVince81
Member

PVince81 commented Oct 31, 2022

I was able to sync a lot of files these days with desktop client 3.6.1 on openSUSE Tumbleweed 20221003 and Nextcloud 25.0.1 RC1.

However, after having synced a lot of files, it's now stuck at 100%.
The UI says it's going to sync about 111 kb of files.
The log only says this:

#=#=#=# Syncrun started 2022-10-31T21:55:05Z
#=#=#=#=# Propagation starts 2022-10-31T21:55:07Z (last step: 1771 msec, total: 1771 msec)

the nextcloud.log has this message:

{
  "reqId": "Y2BEO2rciF3muyD7udwG9AAAAAA",
  "level": 3,
  "time": "2022-10-31T21:55:17+00:00",
  "remoteAddr": "x.x.x.x",
  "user": "vincent",
  "app": "webdav",
  "method": "POST",
  "url": "/remote.php/dav/bulk",
  "message": "Unknown error while seeking content",
  "userAgent": "Mozilla/5.0 (Linux) mirall/3.6.1git (Nextcloud, opensuse-tumbleweed-5.19.13-1-default ClientArchitecture: x86_64 OsArchitecture: x86_64)",
  "version": "25.0.1.0",
  "exception": {
    "Exception": "Sabre\\DAV\\Exception",
    "Message": "Unknown error while seeking content",
    "Code": 500,
    "Trace": [
      {
        "file": "/var/www/html/nextcloud/apps/dav/lib/BulkUpload/MultipartRequestParser.php",
        "line": 129,
        "function": "isAt",
        "class": "OCA\\DAV\\BulkUpload\\MultipartRequestParser",
        "type": "->"
      },
      {
        "file": "/var/www/html/nextcloud/apps/dav/lib/BulkUpload/BulkUploadPlugin.php",
        "line": 71,
        "function": "isAtLastBoundary",
        "class": "OCA\\DAV\\BulkUpload\\MultipartRequestParser",
        "type": "->"
      },
      {
        "file": "/var/www/html/nextcloud/3rdparty/sabre/event/lib/WildcardEmitterTrait.php",
        "line": 89,
        "function": "httpPost",
        "class": "OCA\\DAV\\BulkUpload\\BulkUploadPlugin",
        "type": "->"
      },
      {
        "file": "/var/www/html/nextcloud/3rdparty/sabre/dav/lib/DAV/Server.php",
        "line": 472,
        "function": "emit",
        "class": "Sabre\\DAV\\Server",
        "type": "->"
      },
      {
        "file": "/var/www/html/nextcloud/3rdparty/sabre/dav/lib/DAV/Server.php",
        "line": 253,
        "function": "invokeMethod",
        "class": "Sabre\\DAV\\Server",
        "type": "->"
      },
      {
        "file": "/var/www/html/nextcloud/3rdparty/sabre/dav/lib/DAV/Server.php",
        "line": 321,
        "function": "start",
        "class": "Sabre\\DAV\\Server",
        "type": "->"
      },
      {
        "file": "/var/www/html/nextcloud/apps/dav/lib/Server.php",
        "line": 360,
        "function": "exec",
        "class": "Sabre\\DAV\\Server",
        "type": "->"
      },
      {
        "file": "/var/www/html/nextcloud/apps/dav/appinfo/v2/remote.php",
        "line": 35,
        "function": "exec",
        "class": "OCA\\DAV\\Server",
        "type": "->"
      },
      {
        "file": "/var/www/html/nextcloud/remote.php",
        "line": 171,
        "args": [
          "/var/www/html/nextcloud/apps/dav/appinfo/v2/remote.php"
        ],
        "function": "require_once"
      }
    ],
    "File": "/var/www/html/nextcloud/apps/dav/lib/BulkUpload/MultipartRequestParser.php",
    "Line": 111,
    "message": "Unknown error while seeking content",
    "exception": {},
    "CustomMessage": "Unknown error while seeking content"
  }
}

Basically: when I start the desktop client again, first I see the above message, and then suddenly it's at 100% CPU on the client side.
Maybe it's not handling this error correctly?
I'm also unsure why such an error would occur on the server side.

Let me know where I should look if there's a way to provide more information.

@artonge @mgallien

@PVince81
Member

I've just tried to make the server return a clean 400 Bad Request, but the client still gets stuck at 100% CPU, so perhaps the error handling there has some issue or an infinite loop of sorts.

The server-side error seems to point to a problem in the multipart request.

Ah, and I just noticed that I enabled the upload speed limitation today, and that it was also reported to cause trouble in the original report. @mgallien

@PVince81
Member

I've restarted the client, paused the sync, removed the upload and download speed limits, then resumed the sync: no more issues.

Maybe there's something odd about the combination of bulk upload and an upload speed limit.
I hope the speed limit implementation itself isn't somehow truncating multipart requests?

@artonge

artonge commented Nov 2, 2022

The server fails when trying to seek the stream backwards. Two possibilities:

  1. Somehow, the stream content is no longer accessible; not sure why this could happen.
  2. The fread call does not return false when reading past EOF, but the fseek call returns -1 when seeking back to an inaccessible offset.
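A small Python analogy of these two behaviours (an assumption, for illustration, that it mirrors the PHP stream semantics described here; an OS pipe stands in for the non-seekable request body stream):

```python
import os

# Reading past EOF on a pipe quietly returns empty data (like fread not
# returning false), while seeking backwards on the same non-seekable
# stream fails outright (like fseek returning -1).
r_fd, w_fd = os.pipe()
os.write(w_fd, b"part-1")
os.close(w_fd)

stream = os.fdopen(r_fd, "rb", buffering=0)
data = stream.read(1024)            # consumes all 6 bytes without error
past_eof = stream.read(1024)        # past EOF: returns b"", not an error
try:
    stream.seek(-6, os.SEEK_CUR)    # pipes cannot seek backwards
    seek_failed = False
except OSError:                     # analogous to fseek() returning -1
    seek_failed = True
stream.close()

print(data, past_eof, seek_failed)
```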

@PVince81
Member

PVince81 commented Nov 2, 2022

from my experience, seeking backwards with fseek is often a bad idea, as it can break on streams (as opposed to files)

@Nowaker

Nowaker commented Nov 3, 2022

Same thing here. Using Mac OS client.

@The13thTimelord

The13thTimelord commented Nov 6, 2022

Same here; CPU & RAM usage are really high, and I also just get
"Network Error: 99".

EDIT: I'm using the Windows desktop client.

@mgallien mgallien added the approved bug approved by the team label Nov 7, 2022
@mgallien mgallien self-assigned this Nov 7, 2022
@Devonavar

I'm also affected after upgrading to Nextcloud 25. Can confirm that disabling "upload limiting" in the desktop client makes this bug go away (but it then hoses my upload bandwidth to the point of making downloading impossible whenever I have a large upload).

NextCloud 25.0.1
Desktop Client Version 3.5.4 (Ubuntu)
Ubuntu 20.04

@Der-K-2000

Any updates on this? Even in v3.6.2 the errors still occur.

@avatar1024

I can reproduce this bug and can confirm that removing the speed limit solves it. It is the combination of bulk upload + speed limits that causes this issue, but it "just" looks like a bug in the client: on the server side the files had synchronised properly, yet the client hangs with high CPU use.

@Baeda73

Baeda73 commented Nov 15, 2022

> I'm also affected after upgrading to Nextcloud 25. Can confirm that disabling "upload limiting" in the desktop client makes this bug go away (but it then hoses my upload bandwidth to the point of making downloading impossible whenever I have a large upload).
>
> NextCloud 25.0.1 Desktop Client Version 3.5.4 (Ubuntu) Ubuntu 20.04

This solves the issue for me!

@mayonezo
Contributor

I can confirm this issue and the solution of setting the Bandwidths to "No limit" on Debian Bullseye.

@pchesneau

Hello!
I confirm the issue and the solution (disabling the bandwidth limit) on Windows 10 21H2 with Nextcloud client 3.6.2 and Nextcloud 25.0.1.

@mrothauer

> Hello! I confirm the issue and the solution (disabling the bandwidth limit) on Windows 10 21H2 with Nextcloud client 3.6.2 and Nextcloud 25.0.1.

same here - solved the issue for me

@zeus86

zeus86 commented Nov 22, 2022

I can confirm the issue on Ubuntu 20.04 with client 3.5.4 (x86_64) when bandwidth limits are in place.
I can also confirm that it is server-dependent. I have two NC servers hooked up to my client: one with NC 25, one with NC 23. Neither has anything configured regarding bulk uploads, but the client only locks up when a file would be synced to the NC 25 server (just a plain text file of a few kB, nothing that would have been complicated to seek).

PS: because this box is unchecked in the OP: the NC 25 server uses LDAP as its user base.

@hussong

hussong commented Nov 23, 2022

I can confirm the issues on Ubuntu Mate 22.04 with the default client from the sources as well as the most recent version (3.6.2) from ppa:nextcloud-devs/client. Issues started happening once the server was updated to 25.0.1.

Disabling the bandwidth limits made the client sync again.

At first, I couldn't access the settings in the GUI, because it would freeze right after (re-)starting the client. So, to get to the settings, I edited the config file .config/Nextcloud/nextcloud.cfg where I set 0\Folders\1\paused=false to 0\Folders\1\paused=true. With the client launching in paused mode, I could access settings and disable the bandwidth limit in the GUI. Then I hit play and it synced fine.
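For anyone scripting that edit, here is a minimal sketch of the flag flip (assuming the Qt QSettings-style key layout shown in the comment, e.g. `0\Folders\1\paused`; back up nextcloud.cfg before touching it):

```python
# Hypothetical helper for the workaround above: flip every per-folder
# "paused" flag in nextcloud.cfg text from false to true so the client
# starts up with syncing paused. The key format is assumed from the
# comment, not taken from client documentation.
def pause_all_folders(cfg_text: str) -> str:
    out = []
    for line in cfg_text.splitlines():
        key, sep, value = line.partition("=")
        if sep and key.endswith("paused") and value == "false":
            line = key + "=true"    # pause this sync folder
        out.append(line)
    return "\n".join(out)

print(pause_all_folders(r"0\Folders\1\paused=false"))
```

Running this on the sample key prints the patched `paused=true` line; applying it to the real file is left to the reader.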

@zeus86

zeus86 commented Nov 23, 2022

> I can confirm the issues on Ubuntu Mate 22.04 with the default client from the sources as well as the most recent version (3.6.2) from ppa:nextcloud-devs/client. Issues started happening once the server was updated to 25.0.1.
>
> Disabling the bandwidth limits made the client sync again.
>
> At first, I couldn't access the settings in the GUI, because it would freeze right after (re-)starting the client. So, to get to the settings, I edited the config file .config/Nextcloud/nextcloud.cfg where I set 0\Folders\1\paused=false to 0\Folders\1\paused=true. With the client launching in paused mode, I could access settings and disable the bandwidth limit in the GUI. Then I hit play and it synced fine.

That's how I solved it, too.

@pierr0t

pierr0t commented Nov 24, 2022

Hello,
I see the same problem on macOS Ventura with client 3.6.2 accessing a v25.0.1 server, and the 'bulkupload.enabled' => false, solution seems to fix it, at least for the last 10 minutes (the client had been stuck for the last 24 hours).
For ref: https://help.nextcloud.com/t/macos13-1-ventura-ncclient-3-6-2-hang-does-not-respond-no-sync/150338/7
Pierre

@PhilippSchlesinger

@rstockm
The client version you are using is nearly one and a half years old; the current version is 3.13.0.
You might want to check with that version, which contains the fix mentioned in #5094 (comment)

@alerque

alerque commented Jun 11, 2024

@PhilippSchlesinger I don't know why @rstockm is running such an old client but this issue is still happening in 3.13.0. The fix mentioned as being in 3.9.0 never did work. Many people have confirmed the issue on versions since then.

@farvardin

> @PhilippSchlesinger I don't know why @rstockm is running such an old client but this issue is still happening in 3.13.0. The fix mentioned as being in 3.9.0 never did work. Many people have confirmed the issue on versions since then.

I can confirm that on Linux the 3.13.0 version is buggy and causes my computer to hang with 100% CPU load. It works fine with the 3.9.0 AppImage.

#6749 (comment)

@kupietools

kupietools commented Jul 25, 2024

> @PhilippSchlesinger I don't know why @rstockm is running such an old client but this issue is still happening in 3.13.0. The fix mentioned as being in 3.9.0 never did work. Many people have confirmed the issue on versions since then.
>
> I can confirm on Linux the 3.13.0 version is buggy and causes my computer to hang with 100% cpu load. It works fine with the 3.9.0 appimage
>
> #6749 (comment)

I have tried 3.3.4, 3.9, and 3.13.2 on Mac. They all have this showstopper bug. As near as I can determine, there is no working NextCloud client for Mac, and nobody wants to admit it.

Is this project abandoned? Why are issues going months without even being acknowledged?

@alerque

alerque commented Jul 25, 2024

@kupietools You're being a bit ridiculous; obviously, with an ongoing string of releases, this project is not abandoned. Some issues are just a bit mismanaged. For whatever reason, nobody wants to own up to there being a bug here and just reopen the issue. Also, if you actually read the comments before yours, you'll notice it is not a Mac issue and there is a workaround.

Meanwhile, you can use any client version you like just fine, but you have to get your Nextcloud instance admin (maybe that's yourself?) to add 'bulkupload.enabled' => false to the server's config.php.

@ebastler

ebastler commented Jul 25, 2024

> add 'bulkupload.enabled' => false to the server's config.php.

Sadly, that leads to a huge performance loss. I'm often syncing a bunch of tiny files around (a target folder for scientific simulations with a lot of small result files), and even when syncing directly over GigE on the same network, the workaround made syncing lots of small files very tedious.

How such a critical bug can remain unaddressed for almost 2 years since the first bug report is beyond me. If it is very difficult to fix and needs time, so be it. But we did not even get a "yeah, this is a bug, we know about it and are working on it" message so far. Just some random closing of the issue report while the actual issue persisted just the same.

@TheThief

Yes, the workaround is very much suboptimal.

@alerque

alerque commented Jul 25, 2024

Yes, the workaround is very much suboptimal. But for most folks, slow is better than 100% CPU usage and no functionality. I don't know how this hasn't hit any developers such that they would be motivated to actually chase it down, and I don't know enough about debugging the desktop app to track down the problem myself.

@RainerKlute
Author

The workaround was useful while it was needed. I reverted my Nextcloud instance back to 'bulkupload.enabled' => true a year ago. So the bug seems to be fixed, at least sort of. However, there seem to be some loose ends affecting some people. :-(

@ebastler

> I don't know enough about debugging the desktop app to track down the problem myself.

Same. I'd help if I could, but I really lack the skills for that, sadly.

@alerque

alerque commented Jul 25, 2024

@RainerKlute Clearly not everybody is affected (or developers would have given it more attention long ago), but whatever the trigger is, many people are still affected. I'm glad you are not, but that just suggests you are not hitting the trigger scenario, not that the bug is fixed. Also, what clients are you using? Besides the server-side workaround, the client can also be compiled without bulk upload support, and several distros do this. For example, Arch Linux has the feature patched out, so nobody on Arch Linux is likely to ever see this issue.

I've thought of re-enabling it to put pressure back on the devs to find this problem, but I also don't want to deal with a bunch of bug reports and hassle for all the Arch Linux users who would be negatively affected.

@ebastler

Oh, now that explains why I only encountered it on Windows, not on Arch...

@alerque

alerque commented Jul 25, 2024

The Nextcloud desktop client package on Arch Linux carries a patch that changes the code that detects server capabilities to always report false for bulk upload. This makes it slower than necessary but keeps the client from freezing up. I've tried disabling the patch several times, and reports of problems start flooding in, not only from my own systems but from others.
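For illustration, such a distro patch boils down to forcing the capability check to a constant. A minimal Python sketch (the real client is C++, and the `"dav"`/`"bulkupload"` key names here are assumptions for illustration, not necessarily the real capability layout):

```python
# Hypothetical sketch of what the capability-gating patch amounts to: the
# client asks the server for its capabilities, and the patched build
# simply ignores the bulk-upload entry.
BULKUPLOAD_PATCHED_OUT = True   # what the patched package hard-codes

def supports_bulk_upload(capabilities: dict) -> bool:
    if BULKUPLOAD_PATCHED_OUT:
        return False            # always take the slower per-file path
    return "bulkupload" in capabilities.get("dav", {})

print(supports_bulk_upload({"dav": {"bulkupload": "1.0"}}))  # False
```

With the flag forced on, even a server that advertises bulk upload is treated as if it did not support it, which is why the freeze never triggers on such builds.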

@RainerKlute
Author

> Also what clients are you using? Besides the server-side work around the client can also be compiled without bulk upload support, and several distros do this.

I am using the Nextcloud client that comes with openSUSE Tumbleweed. Don’t know whether bulk upload is inhibited there or how to find out. The server is also running openSUSE Tumbleweed.

@alerque

alerque commented Jul 25, 2024

I just reviewed the sources for openSUSE Tumbleweed's Nextcloud server and desktop packages, and it does not look like they have patched out bulk upload support on either side. That's an interesting data point, but it doesn't isolate what the unique factors are, and we know multiple host platforms and all of the Windows/macOS/Linux clients are affected in some cases.

Do you by chance use any of the rate limiting features on the client? Those seem to be another way people work around triggering this bug.

@kupietools

kupietools commented Jul 25, 2024

Thanks for responding, all.

Unfortunately, my server is a Hetzner Storage Share. They've already passed the buck and told me that while I can continue to pay them money, this is not their problem and they'll do nothing to help, and it's entirely up to me to convince the NextCloud devs to address the problem. However, I'll ask them whether setting 'bulkupload.enabled' => false is an option.

I did discover that the OwnCloud client works and appears not to freeze. However, while NextCloud told me (between freezes) to expect about a four-day wait to upload 400 GB of files, OwnCloud wavered between 3 and 9 months for the same files, so this doesn't seem like a workable solution. I'd rather not wait the better part of a year for my first sync to finish.

> Do you by chance use any of the rate limiting features on the client? Those seem to be another way people work around triggering this bug.

I've tried rate limiting both on auto and off for download, for both the NextCloud and OwnCloud clients. I am reluctant to turn rate limiting off for upload, as this can easily slow down the machine's entire connection due to the back-and-forth packets needed just for downloading, but I'll give it a shot. I did try turning upload limiting off in OwnCloud to see if it made a difference in speed, and it didn't seem to.

UPDATE: I let it run with upload and download both unlimited, and it froze again within a few hours.

@pelzvieh

Obviously, we're discussing something here that is different from the original bug report, which is not the most effective thing to do.

My installation was affected by the problem, and to cure the problem as originally reported it has been, and still is, sufficient to switch the download and upload bandwidth limits to "no limit" in the client. Though I'm not a fan of the way it was fixed, this problem is certainly fixed.

As you are experiencing problems with clients freezing at 100% CPU load, it is obviously a different problem and deserves a different report.

@TheThief

> Obviously, we're discussing something here that is different to the original bug reported, which is not the most effective thing to do.
>
> My installation has been affected by the problem, and to cure that problem as originally reported it has been - and is - sufficient to switch Download and Upload bandwidth limit to "no limit" at the client. Though I'm not a fan of the way it was fixed, certainly this problem is fixed.
>
> As you are experiencing problems with clients freezing with 100% CPU load, obviously it is a different problem and deserves a different report.

I never used rate limiting; the only thing that has ever worked since this first happened has been to disable bulk upload on the server. It's definitely the same issue: bulk upload being enabled results in clients freezing while using 100% CPU. That's the same as the original report.

@unterkomplex

> Thanks for responding, all.
>
> Unfortunately my server is a Hetzner Storage Share, they've already passed the buck and told me that while I can continue to pay them money, this is not their problem and they'll do nothing to help, and it's entirely up to me to convince the NextCloud devs to address the problem. However, I'll ask them if setting 'bulkupload.enabled' => false is an option.
>
> I did discover that the OwnCloud client works and appears not to freeze. However, while NextCloud told me (between freezes) to expect about a four-day wait to upload 400GB of files, OwnCloud wavered between 3-9 months for the same files, so this doesn't seem like a workable solution. I'd rather not wait the better part of a year for my first sync to finish.
>
> Do you by chance use any of the rate limiting features on the client? Those seem to be another way people work around triggering this bug.
>
> I've tried rate limiting both auto and off for download, for both NextCloud and OwnCloud clients. I am reluctant to turn rate limiting off for upload as this can easily slow up the machine's entire connection due to the back-and-forth packets needed just for downloading, but I'll give it a shot. I did try turning the upload limiting off in OwnCloud to see if it made a difference in the speed, and it didn't seem to.
>
> UPDATE: I let it run with the upload and download both not rate-limited, and it froze again inside of a few hours.

This won't fix any bugs in any way, but if you just want to run an initial sync of your data this is what worked for me with a Hetzner Share:

  • Use rclone to access the WebDAV interface of Nextcloud ( https://rclone.org/webdav/ )
  • rclone sync your local directory to remote

@kupietools

kupietools commented Jul 29, 2024

> This won't fix any bugs in any way, but if you just want to run an initial sync of your data this is what worked for me with a Hetzner Share:
>
> • Use rclone to access the WebDAV interface of Nextcloud ( https://rclone.org/webdav/ )
> • rclone sync your local directory to remote

Thanks for the suggestion! I decided to pay for Hetzner's NextCloud hosting because I did look at just paying for cloud storage and using WebDAV to sync, and it wasn't going to work for my purposes. It costs 50% more than their bare-bones cloud storage; I would have just gone for that if it was going to work for me.

@kupietools

@unterkomplex So, just to follow up: I did end up trying rclone, to see whether, once one sync was done, the OwnCloud client's slowness might not be an issue in keeping everything up to date.

The results:

1.) Through several days of crashes and manual restarts, the NextCloud client got me about 75% synced. That took a few days, including a lot of downtime when I didn't realize it had crashed. Last I looked, rclone estimated it would take over a month to complete the final 25%.

2.) I tried anyway and gave it about a week. Twice in that time, I looked and the command-line rclone client had aborted due to I/O errors. (BTW, I'll spare you an exhaustive description of my network, but any configuration or connectivity problems on my computer or network would cause way more things to go wrong than just one single app reporting errors and nothing else.) This isn't usable.

I just canceled my Hetzner Storage Share account. I won't be using NextCloud, since in several weeks of trying it hasn't worked for me at all, and I haven't found any sign that the devs have any interest in fixing the bugs.

Thank you so much to the several people here who did reply; I really appreciate your efforts to be helpful!

@farvardin

farvardin commented Aug 5, 2024

@kupietools have you tried using an older NC client? The bug stopped affecting my server and computer whenever I was using an old client.
Nextcloud is pretty reliable and very useful for my usage (10–15 GB). But I wouldn't use it for that much data; 400 GB is overkill for continuous synchronisation. Better to use a hot/cold storage differentiation.

@lightlike

lightlike commented Aug 5, 2024

@kupietools if you want to sync data, do not really care about fast response times, and just want to keep the data up to date after the initial sync, then you should try looking at https://github.com/syncthing/syncthing.

I am currently using it as a backup solution for my Raspberry Pis to my storage server.
I also successfully used it to keep two NFS servers in sync at an old job, with minimal missing data when a server crashed.
It would also start right back up after the crashed server restarted.

It might be a bit slow to start the sync but should handle your crashes fine. It will just wait until the connection is back.
You can also create a mesh of multiple devices, and you have full control over the way everything is synced.

It is also very lightweight.

@mowny

mowny commented Sep 3, 2024

> This won't fix any bugs in any way, but if you just want to run an initial sync of your data this is what worked for me with a Hetzner Share:
>
> • Use rclone to access the WebDAV interface of Nextcloud ( https://rclone.org/webdav/ )
> • rclone sync your local directory to remote

We are migrating from pCloud to Hetzner Nextcloud because the pCloud client also sucks and saturated our 100 MBit uplink for 3 weeks, continuously resyncing the same files and impeding our other applications. We already have the full cloud (128 G, 239255 files) mirrored here on the Linux server.

There is still a kind of initial sync needed, because nextcloudcmd needs to build the database used to determine whether files missing on one side should be resynced or deleted, even if it doesn't need to transfer any files because they're already there.

The first runs aborted because they ran into the ulimit -n of 1024 files.
After increasing the limit to 65536, the sync seems to run through the whole list (assuming it is sorted the way ls sorts), but keeps failing on the exact same files at the end with OCC::SyncFileItem::NormalError CSyncEnums::CSYNC_INSTRUCTION_NEW "Network error: 99". As each run also takes several hours, continuously at 100% CPU, it seems that the sync never considers any files as finished and re-examines them on each run, which effectively keeps it from making any progress.

In conclusion, the client has several issues:

  • either extremely verbose output or no output at all; there is no middle ground between what feels like -vvv and -s
  • keeping far too many files open for no real reason
  • inadequate error handling: after running into EMFILE, a program should not just abort, but instead close unnecessary files and try again. Even if it does abort, progress should be saved so files already tried do not need to be retried.
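The EMFILE handling suggested in the last point can be sketched as follows. This is an illustrative Python sketch under stated assumptions (the real client is C++, and `open_with_retry`/`idle_handles` are hypothetical names): on "too many open files", release idle cached handles and retry once rather than aborting the whole run.

```python
import errno
import os

# Hypothetical sketch of the suggested error handling: when open() fails
# with EMFILE, close cached-but-idle handles and retry once instead of
# aborting the entire sync.
def open_with_retry(path, idle_handles, mode="rb"):
    try:
        return open(path, mode)
    except OSError as e:
        if e.errno != errno.EMFILE:
            raise                    # unrelated error: propagate
        for fh in idle_handles:      # free descriptors not needed right now
            fh.close()
        idle_handles.clear()
        return open(path, mode)      # ...and try again

# Usage: os.devnull always opens here, so the retry path is not exercised.
fh = open_with_retry(os.devnull, [])
empty = fh.read()
fh.close()
```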
