I am running Home Assistant (hass) on a Raspberry Pi with the Recorder integration. Recorder can log sensor data to any supported database, but that database must be up when hass boots, or the integration fails. By default, the data is recorded in an SQLite database on the microSD card. To avoid wearing out the microSD card, I want to use an external database running on a different system on the LAN. The problem is that the external server is not available 24/7 (I shut it down every night).
In short, the setup is:
- Raspberry Pi running an instance of Home Assistant, running 24/7
- A repurposed PC running Arch Linux, with PostgreSQL database, which is not available all the time
What I want to achieve is:
- Run an instance of PostgreSQL on the Raspberry Pi itself, but only as a temporary cache (in RAM) until it can connect to the external server.
- Once the server is reachable, sync everything cached in the RPi's local database to the server's permanent database, then forward all DML statements directly to the external server and free the RPi's RAM cache.
- When the external server shuts down again, start caching new data in memory again. The cache does not need to retain the older, already-flushed data until the external server comes back.
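To make the intended behaviour concrete, here is a minimal sketch of the cache-and-flush logic in Python. Everything here is illustrative: the `WriteBuffer` class and the `flush_fn` callback are my own names, the cache is an in-memory SQLite table standing in for the RAM-resident PostgreSQL instance, and a real implementation would write to the remote PostgreSQL server (e.g. via psycopg2) instead of calling a callback.

```python
import sqlite3

class WriteBuffer:
    """Cache rows in an in-memory SQLite database while the remote
    server is down; sync and forward directly once it is reachable."""

    def __init__(self, flush_fn):
        self.flush_fn = flush_fn   # writes a batch of rows to the remote DB (assumed)
        self.remote_up = False
        self.mem = sqlite3.connect(":memory:")
        self.mem.execute("CREATE TABLE cache (payload TEXT)")

    def write(self, payload):
        if self.remote_up:
            # Server is up: forward the DML straight to the external server.
            self.flush_fn([payload])
        else:
            # Server is down: cache in RAM.
            self.mem.execute("INSERT INTO cache VALUES (?)", (payload,))

    def on_remote_up(self):
        """Server came back: sync all cached rows, then free the cache."""
        rows = [r[0] for r in self.mem.execute("SELECT payload FROM cache")]
        if rows:
            self.flush_fn(rows)
        self.mem.execute("DELETE FROM cache")
        self.remote_up = True

    def on_remote_down(self):
        self.remote_up = False
```

Usage, with a plain list standing in for the remote database:

```python
received = []
buf = WriteBuffer(received.extend)
buf.write("t=21.5")    # server down: cached in RAM
buf.on_remote_up()     # flushes the cache to the "server"
buf.write("t=21.7")    # server up: forwarded directly
```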
Is it possible to have this kind of setup? If not, what would be the best way to do it?
The recorder section of my current Home Assistant configuration.yaml (192.168.1.10 is the IP of the PostgreSQL server; password redacted):

```yaml
recorder:
  db_url: postgresql://hass:PASSWORD@192.168.1.10:5432/hass
  purge_keep_days: 30
  commit_interval: 300
```
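Whatever mechanism ends up doing the sync would also need to detect when the external server comes back up. A minimal reachability probe in Python (the host, port, and timeout values are illustrative, and a plain TCP connect only shows the port is open, not that PostgreSQL will accept logins):

```python
import socket

def postgres_reachable(host: str, port: int = 5432, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to the PostgreSQL port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: poll until the server at 192.168.1.10 is up, then trigger the sync.
# while not postgres_reachable("192.168.1.10"):
#     time.sleep(30)
```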