I am trying to find a way to print a report of all the current files in Nextcloud, but I can't find any resource for this. Any help would be great; it would also help with debugging.
I tried php occ files:scan --all, but that only prints summary counts rather than listing the files.
I'm in the Nextcloud folder on my server, where the core, apps, and lib folders and index.html are. When I run the command 'find /admin/files/ -type f' I get errors, and I'm not sure why; even a simple copy-paste of your command didn't work either.
It would be easier to help if you showed us the error message.
find path/to/files -type f will list all the files. You can send this output to a file by adding > filename.txt, like:
find /path/to/nextcloud/files/ -type f > filename.txt
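If you want to see how this behaves before pointing it at real data, here is a throwaway sketch using a temporary directory (all paths and filenames below are made up for the demo, not a real Nextcloud install):

```shell
#!/bin/sh
# Build a fake files tree in a temp dir, list it with find, then clean up.
tmp=$(mktemp -d)
mkdir -p "$tmp/files/Photos"
touch "$tmp/files/readme.md" "$tmp/files/Photos/cat.jpg"
# -type f restricts the output to regular files (directories are skipped)
filelist=$(find "$tmp/files" -type f)
echo "$filelist"
rm -rf "$tmp"
```

Redirecting with > filelist.txt instead of capturing into a variable works the same way.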
You did change the path to your actual files, didn't you? Something like:
/var/www/nextcloud/data/Tesla_Pill/files/
or /var/www/html/nextcloud/data/Tesla_Pill/files/
You can also print out all files from all users in one list, like:
find /var/www/html/nextcloud/data/*/files/ -type f > filename.txt
Note the * in the path, which expands to every user's directory.
There might be better solutions, though; I'm not an expert.
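A quick way to convince yourself the glob works, without touching real data (the user names here are invented):

```shell
#!/bin/sh
# Fake data directory with two users; the * glob expands to both of them.
tmp=$(mktemp -d)
mkdir -p "$tmp/data/alice/files" "$tmp/data/bob/files"
touch "$tmp/data/alice/files/a.txt" "$tmp/data/bob/files/b.txt"
allfiles=$(find "$tmp"/data/*/files/ -type f)
echo "$allfiles"
rm -rf "$tmp"
```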
You'll need to stick your head into some bash scripting.
Just as an idea:
Check your config.php file for 'datadirectory' => '/media/nextcloud/data' (I mounted the actual user data on a more secure device in /media/nextcloud). This is your… let's call it dataroot for the next steps.
Every user has their own directory in this dataroot. Type ls -d */ in it to list them.
Now you need to iterate through every folder… I'd use a for loop.
magic
A txt file for every user, listing all of that user's data.
Let me try to copy-paste some stuff together that I found on Stack Overflow…
#!/bin/bash
# let's set the path to your user data
userdir=/media/nextcloud/s2/data
# now read all directories in it. every dir represents a user (some are other stuff)
usernames=$(ls -d -- "${userdir}"/*/)
# now we run through every user folder and list its data
for eachuser in $usernames
do
    ls -R "$eachuser" >> report.txt
done
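One caveat with looping over ls output like this: unquoted word splitting breaks filenames that contain spaces. A small demonstration of the pitfall (the file name is made up):

```bash
#!/bin/bash
# A single file whose name contains a space gets split into two loop items.
tmp=$(mktemp -d)
touch "$tmp/my report.txt"
count=0
for f in $(ls "$tmp")   # unquoted expansion splits on whitespace
do
    count=$((count + 1))
done
echo "$count"   # 2 iterations for 1 file
rm -rf "$tmp"
```

A while IFS= read -r loop, or find with -print0 piped to xargs -0, avoids this.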
I'll try to build it a bit better… remember, ls -R uses a lot of CPU.
edit: so the last script did something and created one big file. But who's going to read that?
So now I've built a better version here:
#!/bin/bash
# let's set the path to your user data
datadir=/media/nextcloud/s2/data
# now read all directories in it. most dirs represent a user.
# we filter out the appdata and updater folders.
datacontent=$(ls -d -- "${datadir}"/*/)
for dirs in $datacontent
do
    if [[ $dirs != *"appdata"* ]] && [[ $dirs != *"updater"* ]] # add other filters if needed HERE
    then
        echo "$dirs" >> users.txt
    fi
done
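The filter in that if can be tested on its own; here is a sketch with made-up directory names (note that [[ ]] is a bashism, so this needs bash, not plain sh):

```bash
#!/bin/bash
# Keep only directories that look like user homes; drop appdata/updater.
kept=""
for d in data/alice/ data/appdata_ocx123/ data/updater-ocx123/ data/bob/
do
    if [[ $d != *"appdata"* ]] && [[ $d != *"updater"* ]]
    then
        kept="$kept $d"
    fi
done
echo "$kept"
```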
# now we have a clean user list to run through, putting each user's data
# into a single text file named after that user and the current date
while read -r item
do
    username=${item%/}
    username=${username##*/}
    ls -hlR "$item" >> "$username-$(date +"%d-%m-%Y_%H:%M:%S").txt"
done < users.txt
# finally we delete the user list file. you can comment this out to keep it,
# but remember to delete it before the next run of this script, or it will
# contain duplicate entries and the loop will take twice as long
rm users.txt
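The two ${...} expansions in the loop above do the username extraction: %/ strips the trailing slash, then ##*/ strips everything up to the last remaining slash. A quick check with a path in the same shape as the script's (user name invented):

```bash
#!/bin/bash
item=/media/nextcloud/s2/data/alice/
username=${item%/}        # drop trailing slash -> /media/nextcloud/s2/data/alice
username=${username##*/}  # drop everything up to the last slash -> alice
echo "$username"
```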
edit2: oh, I tested it on my Raspbian Nextcloud instance and the second script is waaaay faster. All I needed to do was run the script with root privileges; use whatever privileges fit your setup.