Migrating from S3 primary storage to local storage

Hello,

I am currently running Nextcloud in AWS EC2 with S3 as primary storage. I’d like to move it to a home server using local storage, primarily to save on cost.

What’s the best approach to migrate the data?

I’ve tried to sync the contents of the S3 bucket to local storage using the aws s3 sync command, but apparently Nextcloud doesn’t organize files in the bucket the same way it does on local storage.

% ls -ltr /data/nextcloud|tail
-rw-r--r--  1 www  www     139394 Apr 12 14:08 urn:oid:177499
-rw-r--r--  1 www  www        551 Apr 12 14:08 urn:oid:177506
-rw-r--r--  1 www  www        191 Apr 12 14:08 urn:oid:177505

Thanks!
Paul

Nextcloud version (eg, 12.0.2): 18.0.3
Operating system and version (eg, Ubuntu 17.04): Debian 10 (Buster)
Apache or nginx version (eg, Apache 2.4.25): Apache 2.4.38
PHP version (eg, 7.1): 7.3

The output of your config.php file in /path/to/nextcloud (make sure you remove any identifiable information!):

<?php
$CONFIG = array (
  'instanceid' => '_redacted_',
  'passwordsalt' => '_redacted_',
  'secret' => '_redacted_',
  'trusted_domains' => 
  array (
    0 => '_redacted_',
  ),
  'datadirectory' => '/var/www/nextcloud/data',
  'overwrite.cli.url' => '_redacted_',
  'dbtype' => 'pgsql',
  'version' => '18.0.3.0',
  'dbname' => '_redacted_a',
  'dbhost' => '_redacted_',
  'dbport' => '',
  'dbtableprefix' => 'oc_',
  'dbuser' => '_redacted_',
  'dbpassword' => '_redacted_',
  'installed' => true,
  'ldapIgnoreNamingRules' => false,
  'ldapProviderFactory' => '\\OCA\\User_LDAP\\LDAPProviderFactory',
  'log_type' => 'owncloud',
  'logfile' => '/var/log/nextcloud.log',
  'loglevel' => 0,
  'objectstore' => 
  array (
    'class' => 'OC\\Files\\ObjectStore\\S3',
    'arguments' => 
    array (
      'bucket' => '_redacted_',
      'autocreate' => true,
      'key' => '_redacted_',
      'secret' => '_redacted_',
      'region' => 'eu-west-1',
      'use_path_style' => false,
    ),
  ),
  'maintenance' => false,
  'theme' => '',
  'memcache.local' => '\\OC\\Memcache\\APCu',
  'memcache.distributed' => '\\OC\\Memcache\\Redis',
  'memcache.locking' => '\\OC\\Memcache\\Redis',
  'redis' => 
  array (
    'host' => '/var/run/redis/redis.sock',
    'port' => 0,
  ),
  'mail_smtpmode' => 'smtp',
  'mail_smtphost' => '_redacted_',
  'htaccess.RewriteBase' => '/nextcloud',
);

Did you consider syncing the files via the Nextcloud client?
Another way would be via WebDAV: either mount it via webdavfs or use rclone.org.

I am considering using a fresh install and syncing files with the Nextcloud client, indeed. But that’s a worst-case option, as I have several users (family installation) and they would all have to do this operation themselves.

What would the WebDAV option look like? Mounting the source Nextcloud via WebDAV?

Then have a look at rclone.org.

I’ve already done something similar with rclone, to move data from the S3 bucket to local storage.

The problem is that files in S3 are named like “urn:oid:177499”, which doesn’t seem to be what Nextcloud expects for local storage file names.

After copying all the files from the S3 bucket to the local storage, Nextcloud doesn’t detect them.

I’ve seen some ways to migrate from local storage to S3 which involve renaming files to match their database index. I believe that here I would have to perform the inverse operation?
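As far as I understand it, that’s exactly the mapping: with S3 as primary storage, each file is stored as a flat object named urn:oid:&lt;fileid&gt;, where the number is the fileid column of the oc_filecache table, while local storage expects the file at &lt;datadirectory&gt;/&lt;user&gt;/&lt;path&gt; using that row’s path column. A minimal sketch of the two naming schemes (function names are mine, not from Nextcloud):

```python
def object_name(fileid: int) -> str:
    """Object key used by S3 primary storage: one flat object per fileid."""
    return f"urn:oid:{fileid}"

def local_path(datadir: str, user: str, path: str) -> str:
    """Location local storage expects, with path taken from oc_filecache.path
    (e.g. 'files/Photos/cat.jpg')."""
    return f"{datadir}/{user}/{path}"
```

So the inverse migration means joining each bucket object back to its oc_filecache row and writing it out under the path recorded there.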

Of course, if you use rclone to connect to AWS S3.

I was thinking about connecting to the Nextcloud WebDAV interface.

Did you find any reliable solution here to migrate all the data for all users from S3 to local storage?

Not many people here use S3 as primary storage. If you have official support, they can probably help you. Without support, I’d try to figure out the structure of the storage and whether you can easily change it in the database, starting with a small test setup.
If that works, also try it with a setup where sync clients are connected. Once you think you’ve got it, be extremely careful and back up everything.
If it doesn’t work, you can copy files manually, or set up a new instance, use federated sharing, and migrate over to the new instance slowly.

A script that takes a database dump plus a directory containing the entire downloaded bucket, and then restructures everything properly into a different directory while discarding things like thumbnails, would be really useful here. Maybe it could even connect to S3/Swift and download only the actual files during restructuring, skipping thumbnails and the like. Unfortunately I’m too dumb for this :<
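Roughly this kind of thing, as a rough sketch only: it assumes you’ve already extracted (fileid, owner, path) tuples from the oc_filecache dump yourself, and the skip rules for previews are my guess at what’s safe to discard, so test on a copy first.

```python
import shutil
from pathlib import Path

def restructure(records, bucket_dir, out_dir):
    """Rebuild a local Nextcloud-style data tree from a flat bucket download.

    records: iterable of (fileid, owner, path) tuples taken from the
    oc_filecache dump, where path is relative (e.g. 'files/Photos/cat.jpg').
    bucket_dir: directory holding the downloaded urn:oid:* objects.
    out_dir: target data directory to populate.
    Returns the list of destination paths that were written.
    """
    bucket_dir, out_dir = Path(bucket_dir), Path(out_dir)
    copied = []
    for fileid, owner, path in records:
        # Discard generated content such as previews/thumbnails (assumed rule).
        if path.startswith("appdata_") or "/thumbnails/" in path:
            continue
        src = bucket_dir / f"urn:oid:{fileid}"
        if not src.is_file():
            continue  # directories have rows in oc_filecache but no object
        dst = out_dir / owner / path
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        copied.append(dst)
    return copied
```

After something like this you would still need to point Nextcloud at the new datadirectory, drop the objectstore block from config.php, and rescan so the cache matches the files on disk.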

Like this? :slight_smile:

Kinda, but in reverse :smile:

I have done both. I migrated from local to S3 and regretted it: many synchronization problems with the desktop client, files ending up as 0 B, not to mention the high AWS costs, since they charge for outgoing traffic. Then I did the opposite and moved from S3 back to local. I confess it was not easy; it involved connecting to WebDAV with rclone and applying some updates to the DB.

I will prepare what I did and post it here.

PS: apologies for my English.


I once made a PHP script for migration from S3 to Local. Maybe it is helpful for some of you: GitHub - lukasmu/nextcloud-s3-to-disk-migration: Script for migrating Nextcloud primary storage from S3 to local storage


Hi, does this take everything into consideration? Files, group folders, share links, etc.?

Has anyone who managed to do this successfully been able to share their process, along with obstacles and solutions? I’m looking to make the same move now that my hosting provider offers faster Block Storage volumes.

I appreciate @lukasm sharing the script for automating a portion of this process but I’m hesitant to try this and create issues with the data, as I’m not on the exact same version and don’t know what exactly to expect in the event of a failure (I do have backups, but would rather not risk a major loss).

Thank you so much for your script @lukasm. I migrated from S3 to local storage on my little instance (a few users, not much data). It rocks! :smile: