Synology DSM 6 Photo Station replacement

With the update to DSM 7, Photo Station will be replaced with Synology Moments. Besides the fact that Synology Moments will offer far fewer functionalities, the other downside is that all tags will be lost.

Therefore, the plan is to switch to Immich hosted in a Docker container. In order to preserve the tags, the plan is to export the tags from the PhotoStation database and to add them to the image files in the XMP format and then import the pictures into Immich.

How to "export" the tags from Photo Station 6

The tags are stored in the internal Postgres database.

Connect via SSH to the Synology.

Change to the user postgres

sudo su - postgres

Start the psql command-line tool

psql -U postgres

List all data bases

postgres=# \list

On my system the data is stored in the database named photo.

                                       List of databases
    Name     |           Owner            | Encoding  | Collate | Ctype |   Access privileges   
-------------+----------------------------+-----------+---------+-------+-----------------------
 photo       | PhotoStation               | SQL_ASCII | C       | C     | 

List tables of the database

# change to photo database
postgres=# \c photo
# list tables
photo=# \dt
# quit psql
\q

Now the location of the data is known. In order to not interfere with the current installation, it is best to create a backup and load it in an external PostgreSQL database for further analysis.

# dump the database into a file

# 1. option complete database for usage with PostgreSQL
pg_dump photo > photo_station.sql

# 2. option only table and data for usage with SQLite
pg_dump -s photo | awk 'RS="";/CREATE TABLE[^;]*;/' < photo_station_create.sql
pg_dump --column-inserts --data-only  photo > photo_station_inserts.sql

# find the folder where the file will be located
pwd
#-> /var/services/pgsql

The file(s) do contain SQL statements to create and fill the photo station database to the current state. The files can be copied to a folder accessible from another pc.

sudo su
mv /var/services/pgsql/photo*.sql /volume1/xxxxxxxx
cd /volume1/xxxxxxxx
chown xxxx *
chgrp users *

Add tags to the images as meta data

Import data in SQLite data base

Adjust files to be compatible with sqlite3

in photo_station_create.sql

  • replace "public." with ""
  • replace "::text" with ""
  • replace "::bpchar" with ""
  • replace "::text" with ""
  • replace "::character varying" with ""
  • replace "now()" with"CURRENT_TIMESTAMP"
  • remove all lines wiht "CONSTRAINT"

in photo_station_inserts.sql

  • replace "public." with ""
  • remove all lines starting wiht "SET"
  • remove all lines starting wiht "SELECT pg_catalog"

Create and fill database

sqlite3 photo-station.db < photo_station_create.sql
sqlite3 photo_station.db < photo_station_create.sql 

Collect data required for tagging

Now it is time to analyze the data base and find the relevant data.

  • photo_image

    • id
    • path (absolute path including file name)
    • name (file name)
    • title
    • description
    • resolutionx (pixel)
    • resolutiony (pixel)
    • timetaken (time when the image was taken from EXIF)
    • creation_time (file creation time)
    • gps
    • lat
    • lng
  • photo_image_label

    • id
    • image_id -> photo_image.id
    • label_id -> photo_label.id
    • info_new
      • # either data in this format:
          {"face":{"height":0.09900409728288651,"width":0.1320312470197678,"x":0.425000011920929,"y":0.2899824380874634},"height":0.10450160771704,"width":0.13933547695606,"x":0.42229367631297,"y":0.28938906752412}
        # or in this format
          {"x":0.1441717791411,"y":0.21165644171779,"width":0.12806748466258,"height":0.2678936605317}
        
    • status
  • photo_label

    • id
    • name
    • category 0 and 2 are used
  • photo_video_label

    • id
    • video_path
    • label_id
    • status
  • video

    • id
    • path (absolute path including file name)
    • title
  • video_desc

    • id
    • path (absolute path including file name)
    • title

SQLite3 query to get the images and their labels and face detections.

SELECT photo_image.path, photo_image.name as "name", photo_image.description as "description", 
	   photo_label.name as "label_name", photo_label.category as "label_category", photo_image_label.info_new as "data" 
	   FROM photo_image 
	   INNER JOIN photo_image_label ON photo_image_label.image_id = photo_image.id 
	   inner join photo_label on photo_label.id = photo_image_label.label_id  order by path

Adding the tags to the images

NOTE: Exif data can also be written via Python to the image files, but as usage of piexif is not so well documented, I prefer to use the command-line tool exiftool. For Arch Linux users, it can be installed via the package perl-image-exiftool.

Python script to extract region tags and tags from Photo Station and to attach them as exif data to the images

import PIL.Image
import sqlite3
import shutil
import smbc
import os
import json
import subprocess
from pathlib import Path
from dateutil import parser


inputDir = "smb://xxxxx/photo/"
outputdir = Path("/home/xxxxxxx/SynologyNAS/photo-station/converted/")
dbFileName = "photo_station.db"
def auth_fn(server, share, workgroup, username, password):
    return ("", "user_name", "password")


def copy_image_fn(image):
    relativePath = image[0].replace("/volume1/photo/", "")
    inputPath = inputDir + relativePath
    outputPath = outputdir.joinpath(relativePath)
    print("\n\nProcessing: " + inputPath + " -> " + outputPath.absolute().as_posix())
    inFile = ctx.open(inputPath, os.O_RDONLY)
    os.makedirs(os.path.dirname(outputPath.absolute().as_posix()), exist_ok=True)
    outFile = open(outputPath, 'wb')
    outFile.write(inFile.read())
    outFile.flush()
    inFile.close()    
    outFile.close()
    return outputPath

def set_creation_and_updatetime_fn(filePath, creationAndUpdateTime):
    os.utime(filePath,(creationAndUpdateTime,creationAndUpdateTime))
    return

connection = sqlite3.connect(dbFileName)
cursor1 = connection.cursor()
cursor2 = connection.cursor()
ctx = smbc.Context(auth_fn=auth_fn)

tagedImages = cursor1.execute("""SELECT photo_image.path, photo_image.id, photo_image.gps, photo_image.name, photo_image.description,
                                photo_image.resolutionx, photo_image.resolutiony, timetaken, create_time
                                FROM photo_image 
                                ORDER BY photo_image.id
                            """)

for image in tagedImages:
    # copy images
    outputPath= copy_image_fn(image)
    # get variables
    gps = image[2]
    name = image[3]
    descritpion = image[4]
    width = image[5]
    heigth = image[6]
    creationTime =  image[8] if image[7] == "1970-01-01 00:00:00" else image[7] 
    
    # get tags
    print(  "   Get tags for image ID: " + str(image[1]) + "\n    width: " + str(width) + "\n    heigth: " + str(heigth)  + "\n    GPS: " + gps)
    params = (image[1],)
    tags = cursor2.execute("""SELECT photo_label.name, photo_label.category, photo_image_label.info_new , photo_image_label.image_id
	                        FROM photo_image_label
	                        INNER JOIN photo_label on photo_label.id = photo_image_label.label_id
	                        WHERE photo_image_label.image_id = ?
                            """, params)
    # generate XMP-mwg-rs:RegionInfo
    xmpRegioninfo = "{\nAppliedToDimensions = {\n  W=" + str(width) +",\n  H=" +str(heigth) + ",\n  Unit=pixel,\n},\n  RegionList = \n  [\n"
    # generate XMP-dc:Subject
    xmpDcDescription =""
    firstRegionTag = True
    for tag in tags:
        tagCategory = tag[1]
        tagName =  tag[0]
        if(tagName == ""):
            continue
        print(  "   Applying tag: " + tagName + "\n      category: " + str(tagCategory) + "\n      data: " + tag[2])
        if tagCategory == 0:
            print(  "   Tag category: face tag, Tag: " +  tagName)
            # region can be stored in two formats, in the second case the coordinates stored in the face section must be used
            # {"x":0.28480509148767,"y":0.3372641509434,"width":0.054097056483691,"height":0.069575471698113}
            # {"face":{"height":0.3335937559604645,"width":0.2501464486122131,"x":0.719976544380188,"y":0.530468761920929},"height":0.32822085889571,"width":0.24616564417178,"x":0.72239263803681,"y":0.53578732106339}
            regionData = json.loads(tag[2])
            #if "face" in regionData:
            #    regionData = regionData["face"]
            #    print("   Using explicit face tag: " + str(regionData))
            if not firstRegionTag :
                xmpRegioninfo +=","
            # x and y position has to be corrected as origin used by photo station differs from XMP standard
            correctedX = regionData["x"] + regionData["width"] * 0.5
            correctedY = regionData["y"] + regionData["height"] * 0.5
            xmpRegioninfo += "    { Area = {\n      W = " + str(regionData["width"]) + ", H = " + str(regionData["height"]) + ", X = " + str(correctedX) + ", Y = " + str(correctedY) + ",\n"
            xmpRegioninfo += "      Unit=normalized, \n}," 
            xmpRegioninfo += "    Name=" + tagName + ",Type=Face}"
            firstRegionTag = False
            # additional adding the names to the subsicrtion
            if xmpDcDescription != "":
                xmpDcDescription += ","    
            xmpDcDescription += tagName
        if tagCategory == 2:
            print("   Tag category: dc description, Tag: " +  tagName)
            if xmpDcDescription != "":
                xmpDcDescription += ","    
            xmpDcDescription += tagName
    xmpRegioninfo += "\n  ],\n}\n"

    xmpInflieName = outputPath.absolute().as_posix() + ".XMPIN"
    xmpInflie = open(xmpInflieName, "w")
    xmpInflie.write(xmpRegioninfo)
    xmpInflie.close()
    exifToolArgs =""

    if(outputPath.suffix != ".BMP" and outputPath.suffix != ".bmp"):
        exifToolArgs = "-overwrite_original \"-RegionInfo<=" + xmpInflieName + "\"" 
        if xmpDcDescription != "":
            exifToolArgs += " -XMP-dc:description=\"" + xmpDcDescription + "\""   
        exifToolArgs += " \"" + outputPath.absolute().as_posix() + "\""
        exifToolCommand = "exiftool"
        print(exifToolCommand + " " + exifToolArgs)
        returnCode = os.system(exifToolCommand + " " + exifToolArgs)
        if returnCode != 0: 
            print("exiftool return code:" + str(returnCode));
            break;
    
    set_creation_and_updatetime_fn(outputPath, parser.parse(creationTime).timestamp())
    
    
    

Installation of Immich in a Docker container

Preparation

NOTE: In order to use Docker Compose, Synology Container Manager must be used instead of Docker. Container Manager replaces Docker and is available on DSM 7.2 and higher. From 7.1 to 7.2 for my DS918+, the update has to be performed manually.

Create a directory for your Immich installation. As a good practice, the suggestion is to create the directory in the docker folder.

mkdir /volume1/docker/immich

Create the directories for the Postgres database and the photo library

mkdir /volume1/docker/immich/postgres
mkdir /volume1/docker/immich/library

Download docker-compose.yml and example.env to your computer. Upload the files to the /Volume1/docker/immich directory, and rename example.env to .env. Note: If you want to use the Synology Text Editor to edit the .env file on the NAS within File Station, you need to rename it to a temporary name (e.g. env.txt).

Edit the .env file. Define a custom DB_PASSWORD and set the folder to the location where the library and the data shall be stored as well as the time zone.

# The location where your uploaded files are stored
UPLOAD_LOCATION=/volume1/docker/immich/library

# The location where your database files are stored. Network shares are not supported for the database
DB_DATA_LOCATION=/volume1/docker/immich/postgres

Create the container using Container Manager

Open the container manager and create a new project. Select the created immich base folder (/Volume1/docker/immich). The docker-compose.yml will be detected, and an option is displayed to use this file. The file is now opened in an editor to verify its content; click next to continue.

For now skip the setup of the WebStation and continue with the creation and start of the container. As the not-yet-secured container shall not be accessible from the web.

Adjust firewall to access the container

Once your containers have started, navigate to the "Container" section of Container Manager, right-click on the "immich-server" container, and choose the "Details". Note down the IP Address listed in the Network section.

Open "Control Panel" on your Synology NAS, select "Security" and navigate to "Firewall". Click "Edit Rules" and add the following firewall rules.

  • Add a "Source IP" rule for the IP address of your container that you obtained in the step above
  • Add a "Ports" rule for the port specified in the docker-compose.yml, which should be 2283

Immich can now be accessed via HTTP on port 2283 of your NAS. NOTE: This connection is not encrypted and shall only be used in a secure network!

For secure access via HTTPS, WebStation can be used. Therefore, enable the WebPortal settings for the immich project in the container manager. Select the port defined in the Docker file and HTTP for the internal communication.

Then, choose a port-based server portal in the Web Station dialog, enable HTTPS, and define a custom port that now can be used to access Immich using the certificate installed in the Web Station.

Import of the prepared images and videos

Preparations

  • Enable face input in the "Metadata Settings"

Veröffentlicht in Aktuelles, EDV am 10. February 2026