
OpenHIM Data

OpenHIM backup & restore

OpenHIM transaction logs and other data are stored in the Mongo database. Restoring this data restores the full history of transactions, which is essential to recover if something unexpected happens and the data is lost.

In the following sections, we will cover:

  • The jobs already implemented to create periodic backups

  • How to restore the backups

Backup & Restore

Single node

The following job may be used to schedule backups for a single-node Mongo instance:

[job-run "mongo-backup"]
schedule= @every 24h
image= mongo:4.2
network= mongo_backup
volume= /backups:/tmp/backups
command= sh -c 'mongodump --uri=${OPENHIM_MONGO_URL} --gzip --archive=/tmp/backups/mongodump_$(date +%s).gz'
delete= true
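
The job above writes timestamped, gzipped archives to /backups on the host. As a minimal sketch, assuming a single-node deployment where the Mongo service is reachable as mongo-1 (the hostname is an assumption), OPENHIM_MONGO_URL could be set as below, and listing /backups on the host confirms that archives are being produced:

# Hypothetical single-node connection string used by the backup job
OPENHIM_MONGO_URL=mongodb://mongo-1:27017/openhim
# On the host: check that dump archives are appearing
ls -lh /backups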

Cluster

The following job may be used to schedule backups for a clustered Mongo deployment. The job definition is identical to the single-node one; only the value of OPENHIM_MONGO_URL differs (see the example after the job definition):

[job-run "mongo-backup"]
schedule= @every 24h
image= mongo:4.2
network= mongo_backup
volume= /backups:/tmp/backups
command= sh -c 'mongodump --uri=${OPENHIM_MONGO_URL} --gzip --archive=/tmp/backups/mongodump_$(date +%s).gz'
delete= true
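
For a cluster, OPENHIM_MONGO_URL should be a replica set connection string. Based on the hosts used in the restore command below (mongo-1, mongo-2, mongo-3 and the mongo-set replica set), it would look something like:

# Hypothetical replica set connection string used by the backup job
OPENHIM_MONGO_URL=mongodb://mongo-1:27017,mongo-2:27017,mongo-3:27017/openhim?replicaSet=mongo-set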

Restore

To restore from a backup, launch a Mongo container with access to the backup files and the mongo_backup network by running the following command:

docker run -d --network=mongo_backup --mount type=bind,source=/backups,target=/backups mongo:4.2
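
To get a shell inside that container, find its ID with docker ps and exec into it. A minimal sketch (the container ID is a placeholder):

# List containers started from the mongo:4.2 image, then open a shell in the one just launched
docker ps --filter ancestor=mongo:4.2
docker exec -it <CONTAINER_ID> bash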

From inside the container, run mongorestore:

mongorestore --uri="mongodb://mongo-1:27017,mongo-2:27017,mongo-3:27017/openhim?replicaSet=mongo-set" --gzip --archive=/backups/<NAME_OF_BACKUP_FILE>
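
The URI above targets a three-member replica set. For a single-node deployment the same restore works with a plain connection string; assuming the node is reachable as mongo-1 (the hostname is an assumption), the command would look like:

mongorestore --uri="mongodb://mongo-1:27017/openhim" --gzip --archive=/backups/<NAME_OF_BACKUP_FILE>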

The data should be restored.
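
To sanity-check the restore, you can count the restored transaction documents with the mongo shell bundled in the image. A rough sketch, assuming OpenHIM's default transactions collection:

mongo "mongodb://mongo-1:27017,mongo-2:27017,mongo-3:27017/openhim?replicaSet=mongo-set" --quiet --eval 'db.transactions.countDocuments({})'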

  • Single node restore docs
  • Cluster restore docs