Using Multiple ZFS Datasets

I am setting up a Docker-based home file/media server running on Ubuntu with services such as NextCloud, Samba, Plex, Calibre, etc.

I am using a ZFS pool for the storage and would like to separate the different content with a dataset hierarchy, as shown in the screenshot, since the different types of data have different importance (and as such will have different backup schemes) and will be used by / accessible from different services.

For instance, users in NextCloud will have their own personal folders on a dataset, but should also have common access (in NextCloud) to the media and library datasets/folders, so they can upload content used by other services such as Plex.

My question is: how will this work with NextCloud? From what I am reading, it seems like using anything other than a single common dataset/location for the NextCloud data is asking for trouble. Or is there a way to achieve this without risking the data, sync issues, corruption, messing up the NC database, etc.?

It works like a charm.

NAME                    USED  AVAIL  REFER  MOUNTPOINT
dsk                    1.33T  2.19T    96K  /mnt/dsk
dsk/backups            39.1G  2.19T  39.1G  /mnt/dsk/backups
dsk/nextcloud          25.3G  2.19T  21.6G  /mnt/dsk/nextcloud
dsk/nextcloud/preview  2.80G  2.19T  2.80G  /mnt/dsk/nextcloud/appdata_xxx/preview
dsk/services            336M  2.19T   320M  /mnt/dsk/services

In my configuration:

  • The Nextcloud web server lives in a subdirectory of the dsk/services dataset.
  • User data is stored in the dsk/nextcloud dataset.
  • Photo previews are in the dsk/nextcloud/preview child dataset, which gives me an easy way to skip them in non-recursive ZFS snapshots.
  • Snapshots-as-files are in the dsk/backups dataset, to be encrypted and uploaded overnight to public cloud storage.
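
For reference, a layout like that can be built with a few zfs commands. This is only a rough sketch: it assumes the pool dsk already exists, and appdata_xxx stands for whatever your instance-specific appdata directory is actually called.

# one dataset per kind of data
zfs create dsk/backups
zfs create dsk/nextcloud
zfs create dsk/services

# child dataset for previews, mounted inside the Nextcloud data directory
# (so a plain, non-recursive snapshot of dsk/nextcloud skips the previews)
zfs create -o mountpoint=/mnt/dsk/nextcloud/appdata_xxx/preview dsk/nextcloud/preview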

The main thing to remember is the user and group your web server runs as. All Nextcloud directories and files must be owned by this user, see p. 11 in the Manual Upgrade Procedure.
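
On a typical Ubuntu install that user is www-data, so the ownership fix is just a recursive chown (sketch only; substitute your actual web server user/group and data path):

# make the web server user own the Nextcloud data dataset
chown -R www-data:www-data /mnt/dsk/nextcloud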

As for the external storages available in Nextcloud, the administrator configures access for all users, and each user can configure individual access; see Configuring External Storage (GUI).
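
The same mounts can also be set up from the command line with occ if you prefer; a rough sketch, where the occ path, the /Media mount name, the datadir and the family group are all made up for the example:

# enable the External storage support app
sudo -u www-data php /var/www/nextcloud/occ app:enable files_external

# expose a local directory as an external storage called "Media"
sudo -u www-data php /var/www/nextcloud/occ files_external:create /Media local null::null -c datadir=/mnt/dsk/media

# optionally limit the mount to a group (use the mount id printed by the previous command)
sudo -u www-data php /var/www/nextcloud/occ files_external:applicable --add-group=family 1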


Thanks for your reply 🙂 So let's say I want a dataset with music, used by, say, my Plex container, available to all users so that they can upload new music to this directory/dataset. How would I do this?

In other words, you are asking how Nextcloud and Plex can work with the same data concurrently while avoiding data corruption?

I would use one of two options:

  1. A Nextcloud Group folder holding the shared set of files, with Plex given read-only access to that directory. The directory itself can be the mount point for a dedicated ZFS dataset with ZFS-level permissions.

I’d recommend the read-only option because Plex can silently modify the list of files, file attributes, etc., and Nextcloud won’t be aware of these changes. This can lead to serious errors due to a mismatch between the Nextcloud file database and the actual state of the file system. To keep the Nextcloud database in sync with the file system, you would need to scan the file system periodically (see the next option).
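
One way to give Plex that read-only view while Nextcloud's user keeps full ownership is POSIX ACLs on the dedicated dataset. A sketch, assuming a dataset called dsk/music, a web server running as www-data and Plex running as the plex user (on ZFS on Linux the dataset needs acltype=posixacl for this to work):

# allow POSIX ACLs on the dataset
zfs set acltype=posixacl dsk/music
zfs set xattr=sa dsk/music

# Nextcloud's web server user owns the data
chown -R www-data:www-data /mnt/dsk/music

# give plex read-only access to existing files and, via the default ACL, to new ones
setfacl -R -m u:plex:rX /mnt/dsk/music
setfacl -R -d -m u:plex:rX /mnt/dsk/music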

  2. A separate ZFS dataset with read/write access for both Nextcloud and Plex.
    This dataset could be mounted as Nextcloud External storage as well as serving as the working directory for Plex.
    In this case, it is important to run a periodic cron job so that changes made by Plex are picked up by Nextcloud in time, see Adding files to external storages.
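
The cron job itself is just a periodic occ files:scan; a sketch (the 15-minute interval and the install path are examples, not recommendations):

# /etc/cron.d/nextcloud-scan: re-index external storage every 15 minutes
*/15 * * * * www-data php /var/www/nextcloud/occ files:scan --all --quiet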

I use the second option: a dedicated ZFS dataset that remote and local users can access via Nextcloud, plus local Windows PCs can access it using Samba. Plex has access to this dataset at the server file system level.
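
For the Samba side, the share is just an ordinary smb.conf entry pointing at the dataset's mountpoint; a sketch with a made-up share name, path and group:

# /etc/samba/smb.conf (excerpt)
[media]
   path = /mnt/dsk/media
   browseable = yes
   read only = no
   valid users = @family
   # write as the web server user so ownership stays consistent for Nextcloud
   force user = www-data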