I am currently working with a system that has been upgraded piecemeal from an original Visual FoxPro solution into one that now has the following parts:
Local FoxPro installation (this is a Point Of Sale system, so designed to be used on touchscreens in stores / salons)
Local Windows service which syncs data from the local FoxPro database into a remote PostgreSQL DB over a series of REST APIs. This loop both pushes local data and pulls new remote data (which can come from online booking systems, for example); see the sketch after this list.
An online SaaS-style portal which is backed by the central PostgreSQL DB and provides a suite of additional functionality over and above the local install: dashboards, detailed reporting, marketing, and online bookings, amongst others.
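For context, the sync loop is roughly this shape (TypeScript purely for illustration; the real service is a Windows service, and the endpoint paths, `SyncRecord` shape, and helper functions here are all made up):

```typescript
interface SyncRecord {
  id: string;
  table: string;
  payload: unknown;
  updatedAt: string; // ISO timestamp used as the sync cursor
}

// Placeholders for the FoxPro-side plumbing (hypothetical).
declare function readLocalChangesSince(since: string): Promise<SyncRecord[]>;
declare function applyToLocalDb(change: SyncRecord): Promise<void>;

const API_BASE = 'https://portal.example.com/api'; // made-up base URL

async function syncOnce(lastSyncedAt: string): Promise<string> {
  // 1. Push local changes made since the last successful sync.
  const localChanges = await readLocalChangesSince(lastSyncedAt);
  await fetch(`${API_BASE}/push`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(localChanges),
  });

  // 2. Pull remote changes (e.g. new online bookings) and apply them locally.
  const res = await fetch(
    `${API_BASE}/pull?since=${encodeURIComponent(lastSyncedAt)}`
  );
  const remoteChanges: SyncRecord[] = await res.json();
  for (const change of remoteChanges) {
    await applyToLocalDb(change);
  }

  // Advance the cursor only once both directions have succeeded.
  return new Date().toISOString();
}
```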
The final stage of this project is to replace the FoxPro system itself with a hosted solution. Ideally this POS would not fall over if the internet dropped out, and it would also support multiple terminals. After a lot of research and testing of frameworks I've settled on Meteor as an ideal approach: it handles the reactive updating between terminals, Minimongo seems to provide sufficient resilience against temporary internet outages, and overall it fits the bill.
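To show what I mean, this is roughly the pattern from my testing (a minimal sketch; the `sales` collection and its fields are made up, and a real app would route client writes through Meteor methods rather than allowing direct inserts):

```typescript
import { Meteor } from 'meteor/meteor';
import { Mongo } from 'meteor/mongo';

// Shared collection: on the server this is backed by MongoDB; on each
// terminal it is mirrored into Minimongo, which keeps working while
// the connection is down.
export const Sales = new Mongo.Collection('sales');

if (Meteor.isServer) {
  Meteor.publish('sales.today', function () {
    // Publish today's sales to every connected terminal.
    const start = new Date();
    start.setHours(0, 0, 0, 0);
    return Sales.find({ createdAt: { $gte: start } });
  });
}

if (Meteor.isClient) {
  Meteor.subscribe('sales.today');

  // Called from the POS UI. The write lands in Minimongo immediately
  // (so the terminal stays usable offline) and should be synced to the
  // server, and from there to the other terminals, on reconnect.
  function recordSale(total: number) {
    Sales.insert({ total, createdAt: new Date() });
  }
}
```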
The architectural decision I am struggling with is the remote database. I have knocked up a PoC project with Angular2 and Mongo, with a hosted remote Mongo DB, and it works. I now need the data to be two-way synced into the PostgreSQL DB, which leaves me with two options:
Sync the remote Mongo DB with PostgreSQL (either over the existing REST APIs or similar); a rough sketch of this follows the list.
Work with one of the experimental packages and try to use my existing PostgreSQL DB as the backend, removing the need for the 'interim' Mongo DB.
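For option 1, the shape I have in mind is a polling sync (a sketch only; it assumes each Mongo document carries an `updatedAt` field, and the database, collection, and REST endpoint names are made up):

```typescript
import { MongoClient } from 'mongodb';

const MONGO_URL = 'mongodb://localhost:27017'; // hosted Mongo URL in practice
const REST_PUSH = 'https://portal.example.com/api/push'; // existing API, path made up

// Poll Mongo for documents changed since the last sync and forward
// them to the existing REST API, which writes to PostgreSQL.
async function syncMongoToPostgres(lastSyncedAt: Date): Promise<Date> {
  const client = await MongoClient.connect(MONGO_URL);
  try {
    const sales = client.db('pos').collection('sales');
    const changed = await sales
      .find({ updatedAt: { $gt: lastSyncedAt } })
      .sort({ updatedAt: 1 })
      .toArray();

    for (const doc of changed) {
      await fetch(REST_PUSH, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(doc),
      });
    }
    return changed.length
      ? changed[changed.length - 1].updatedAt
      : lastSyncedAt;
  } finally {
    await client.close();
  }
}
```

The PostgreSQL-to-Mongo direction would be the mirror image over the existing pull API. I gather MongoDB also offers lower-latency options than polling (oplog tailing, or change streams on newer replica-set deployments), but I don't know enough Mongo to judge them.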
My instinct is to take the first approach; it feels more robust and I already have the APIs in place. However, with very little (almost zero) Mongo experience, am I going about this the wrong way? And if this is the right way, is there a best-practice approach to syncing Mongo like this?