Between Christmas and New Year, I decided to move my Nextcloud server to new hardware. The Raspberry Pi octopus with its many external USB hard drive tentacles had served long enough, and it was time to upgrade.
From Docker Compose to TrueNAS
I installed TrueNAS on the new machine. The hardware setup is roughly this:
- 1 SSD with the OS
- 1 SSD for apps and databases
- 3 HDDs for data in RAID-5
I set up storage accordingly and installed the Nextcloud app. All of this was pretty straightforward. I also activated the SSH service.
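For context, "RAID-5" on TrueNAS means a ZFS RAIDZ1 vdev over the three HDDs. The UI builds the pool for you, but it amounts to something like this sketch (pool and device names are assumptions):
# sketch only, the TrueNAS UI does this for you; pool and device names are assumptions
zpool create tank raidz1 /dev/sda /dev/sdb /dev/sdc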
Now I wanted to transfer the data from the old installation. This was a bit more complicated than I thought:
- Copy the files in /data/ from the old to the new installation
- Convert the database from MySQL to Postgres (MySQL is recommended in the Nextcloud documentation, but Postgres is used by the TrueNAS app)
- Make a backup of the database
- Restore the database backup on the new installation
- Get certificates
Copying the data
This is fairly straightforward:
rsync -Aaxt ./old_data truenas_admin@truenas:/path/to/new/data
I would advise running this in tmux, so it can run in the background.
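For example (the session name is arbitrary):
# start a named tmux session, run the rsync command above inside it,
# then detach with Ctrl-b d
tmux new -s nextcloud-copy
# later, re-attach to check on progress
tmux attach -t nextcloud-copy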
I decided not to copy the whole config file. Instead, I just copied the passwordsalt and secret values.
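One way to read those two values out of the old installation is via occ (the container name is the one from my old Docker Compose setup, used again further below; adjust it to yours):
# print the values from the old installation's config
docker exec -t --user www-data nextcloud-docker-app-1 php occ config:system:get passwordsalt
docker exec -t --user www-data nextcloud-docker-app-1 php occ config:system:get secret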
Converting the database
I created a new database using Docker and exposed the port to the host.
docker pull postgres
docker run --name some-postgres -e POSTGRES_PASSWORD=password -p 5432:5432 -d postgres
Now, there are two ways I found to move the data:
- Using occ db:convert-type. This command was broken and disabled by the devs at the time. Basically: create a Docker network, connect the database and Nextcloud app containers to that network, and run the command (see the sketch after this list).
- Using pgloader, as explained below.
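For reference, the first approach would look roughly like this in case the command works in your version (the network name is made up, and I did not end up running this):
# put the temporary postgres container and the Nextcloud container on one network
docker network create nc-migration
docker network connect nc-migration some-postgres
docker network connect nc-migration nextcloud-docker-app-1
# convert the database in place; occ asks for the postgres password interactively
docker exec -it --user www-data nextcloud-docker-app-1 \
  php occ db:convert-type --all-apps pgsql nextcloud some-postgres nextcloud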
In the Postgres shell, I created the database for Nextcloud with the same username and credentials as used by the TrueNAS app.
CREATE DATABASE nextcloud;
CREATE USER nextcloud WITH PASSWORD 'password';
GRANT ALL PRIVILEGES ON DATABASE nextcloud TO nextcloud;
ALTER DATABASE nextcloud OWNER TO nextcloud;
I then put Nextcloud into maintenance mode and finally moved the data from the MySQL database to Postgres:
# activate maintenance mode
docker exec -t --user www-data nextcloud-docker-app-1 php occ maintenance:mode --on
# copy the data from mysql to postgres
pgloader mysql://nextcloud:password@localhost/nextcloud postgresql://nextcloud:password@localhost/nextcloud
Backing up the database
Make a backup of the Postgres database and copy it to the TrueNAS machine.
# don't use -f, since that would create a file inside the container
pg_dump nextcloud -h localhost -U nextcloud > backup.bak
rsync ./backup.bak truenas_admin@truenas:/mnt/path/NextcloudPostgres/backup.bak
Now on the TrueNAS system, change the owner of the db dump:
chown netdata:docker backup.bak
Finally, restore the backup. Put Nextcloud on TrueNAS into maintenance mode, then open a shell in the Postgres container and run:
psql -h localhost -U nextcloud -d template1 -c "DROP DATABASE \"nextcloud\";"
psql -h localhost -U nextcloud -d template1 -c "CREATE DATABASE \"nextcloud\";"
psql -h localhost -U nextcloud -d nextcloud -f backup.bak
The tables inside the container needed some ownership changes, which I did with a script generated by ChatGPT.
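The gist of that script is a loop over all tables in the public schema, roughly like this sketch (it assumes the container's superuser is called postgres):
# generate an ALTER TABLE statement for every table in the public schema
# and pipe the statements back into psql, so the nextcloud role owns everything
psql -h localhost -U postgres -d nextcloud -At \
  -c "SELECT 'ALTER TABLE public.'||quote_ident(tablename)||' OWNER TO nextcloud;' FROM pg_tables WHERE schemaname='public';" \
  | psql -h localhost -U postgres -d nextcloud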
Getting new certificates
I registered my domain at Porkbun, so I needed a third-party script in order to request a certificate through the TrueNAS UI. However, this script didn't work at first, because the path to the script contained spaces. Once I figured that out, I successfully got a certificate and selected it in the Nextcloud app.
I then installed the Nginx Proxy Manager app and set up my domains. I changed the ports of the TrueNAS web UI to something else and exposed Nginx Proxy Manager on ports 80 and 443.
Now I needed certificates in Nginx Proxy Manager as well. For that, I added my domain to the hosts file inside the Nginx app:
echo "192.168.0.x example.com" >> /etc/hosts
Finally done!