Database, binary files or both?

I'm preparing to write a program that will run on many different machines and be used by many concurrent users. These machines are typically connected to a network.

Data will need to be saved to the network by any of the users, and the saved data should be accessible to all of them.
My current plan is to use a database on the network server but I am open to other suggestions.

On demand, this program should be able to duplicate all saved data in the network database to a user's local machine for use when the user leaves the site and works elsewhere where there is no connection to the network.
My initial thought was to create a binary file on the local machine for each of the database tables. However, this means I would have to write a parser to transfer the data from the tables to the binary files. It also means the program needs to be able to read from and write to both the database and the binary files interchangeably.
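One way to keep the "interchangeable" part manageable is to put a small storage interface between the program and its data, with one implementation backed by the database and another backed by local binary files. A minimal sketch, assuming a made-up record layout and class names (the real tables would each get their own record struct):

```cpp
#include <cstdint>
#include <fstream>
#include <string>

// A flat record; a real program would have one struct per table.
struct Record {
    std::uint32_t id;
    std::string   name;
};

// The program talks to this interface only, so a database-backed store
// and a binary-file store are interchangeable at the call sites.
class IDataStore {
public:
    virtual ~IDataStore() = default;
    virtual void save(const Record& r) = 0;
    virtual bool load(std::uint32_t id, Record& out) = 0;
};

// File-backed implementation: one binary file per table. Strings are
// stored length-prefixed so they round-trip exactly.
class BinaryFileStore : public IDataStore {
public:
    explicit BinaryFileStore(std::string path) : path_(std::move(path)) {}

    void save(const Record& r) override {
        std::ofstream f(path_, std::ios::binary | std::ios::app);
        f.write(reinterpret_cast<const char*>(&r.id), sizeof r.id);
        std::uint32_t len = static_cast<std::uint32_t>(r.name.size());
        f.write(reinterpret_cast<const char*>(&len), sizeof len);
        f.write(r.name.data(), len);
    }

    bool load(std::uint32_t id, Record& out) override {
        std::ifstream f(path_, std::ios::binary);
        Record r;
        std::uint32_t len = 0;
        while (f.read(reinterpret_cast<char*>(&r.id), sizeof r.id) &&
               f.read(reinterpret_cast<char*>(&len), sizeof len)) {
            r.name.resize(len);
            if (len != 0 && !f.read(&r.name[0], len)) break;
            // First match wins; a real store would also handle
            // updates and deletes rather than only appending.
            if (r.id == id) { out = r; return true; }
        }
        return false;
    }

private:
    std::string path_;
};
```

A database-backed class implementing the same `IDataStore` interface would then replace the parser: the duplication step just reads records through one implementation and saves them through the other.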

At this point I'm just looking for recommendations.
Should I avoid using a database altogether and just use binary files that can be easily copied between clients and the server? There will be frequent read and write operations on the server files, so I'm seeing potential "concurrent user" issues with locked files using this method.
I'd actually like to skip the binary files and instead copy the database to the client machines. However, I've had no experience working with databases in C++, so any suggestions or links to related topics would be greatly appreciated.

Thanks!
I would recommend a database since you have multiple users accessing the data at the same time.

I would be careful about letting the users modify the data when not connected to the database, because you will likely have problems with "stale" data. Remember you have multiple users, and they may try to modify the same piece of data. If they are disconnected from the database, there is no way of telling, until they reconnect, whether another user has already modified that same data.

If being able to use the data while disconnected from the database is a requirement, I would recommend treating that data as read-only while disconnected.
jlb, Thanks for the reply. My plan was to create temporary (local-only) duplicate tables for any "off-line" changes that get saved; the other tables would remain read-only, as you've stated. Then, when the user returns, I was going to write another procedure to handle synchronization with the "master" tables. Conceptually it would work much like file sync, checking the edit date of each record. If matching records have already been added or changed in the master database, those changes would take precedence, and the returning user would be notified that their new items and changes need review before being saved to the "master" database.
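The merge step described above boils down to a per-record timestamp comparison against the time the local snapshot was taken. A sketch of just that decision logic, with field and enum names invented for illustration:

```cpp
#include <cstdint>

// Per-record metadata; lastModified could be a Unix timestamp or a
// database-provided edit date.
struct RecordMeta {
    std::uint32_t id;
    std::int64_t  lastModified;
};

enum class SyncAction {
    PushToMaster,   // only the local copy changed: safe to save
    NeedsReview,    // master changed while the user was offline
    NoChange
};

// Decide what to do with one local record when the user reconnects.
// 'snapshotTime' is when the local copy was duplicated from the master;
// 'master' is null if the master row was deleted or never existed.
SyncAction classify(const RecordMeta& local,
                    const RecordMeta* master,
                    std::int64_t snapshotTime) {
    if (!master)
        return SyncAction::NeedsReview;      // master row gone: user must review
    if (master->lastModified > snapshotTime)
        return SyncAction::NeedsReview;      // master changed since the snapshot
    if (local.lastModified > snapshotTime)
        return SyncAction::PushToMaster;     // only the local copy changed
    return SyncAction::NoChange;
}
```

The full sync procedure would loop over the temporary tables, call something like `classify` per record, apply the `PushToMaster` results, and queue the `NeedsReview` ones for the user.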

However, I'm still unsure how to 1) connect to the new database created for this program, and 2) create the local database and duplicate the master to it.
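For the connection itself, Pervasive PSQL provides an ODBC driver, so one common route from C++ on Windows is the standard ODBC API. A minimal sketch, assuming a DSN named "PervasiveDB" has been set up in the ODBC administrator (the DSN name and credentials are placeholders, error handling is abbreviated, and this needs the ODBC headers and a configured driver to actually connect):

```cpp
#include <windows.h>
#include <sql.h>
#include <sqlext.h>
#include <cstdio>

int main() {
    SQLHENV env = SQL_NULL_HENV;
    SQLHDBC dbc = SQL_NULL_HDBC;

    SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
    SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
    SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);

    // "PervasiveDB" is a placeholder DSN configured in the ODBC administrator.
    SQLRETURN rc = SQLConnect(dbc, (SQLCHAR*)"PervasiveDB", SQL_NTS,
                              (SQLCHAR*)"user", SQL_NTS,
                              (SQLCHAR*)"password", SQL_NTS);
    if (SQL_SUCCEEDED(rc)) {
        std::printf("connected\n");
        SQLDisconnect(dbc);
    } else {
        std::printf("connection failed\n");
    }

    SQLFreeHandle(SQL_HANDLE_DBC, dbc);
    SQLFreeHandle(SQL_HANDLE_ENV, env);
    return 0;
}
```

In Visual Studio this links against odbc32.lib. A side benefit of going through ODBC is that the same code can talk to any other ODBC data source, which may help if the local copies end up in a different (free) database format than the Pervasive server.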

Our ERP system runs on a Pervasive v11 database. If at all possible, I'd like to create the database for this new program in Pervasive as well, so we could potentially access the data from the ERP system in the future. However, I thought I saw somewhere in the Pervasive documentation that if you want a local Pervasive database, you need the server software running on that machine (which requires another license). So my thought was to keep the server database in Pervasive and the local databases in another (free) format. Is this feasible? Do you have any suggestions on what database type I could use for the local instances?

I found the Pervasive SDK downloads here (http://www.pervasive.com/database/Home/Products/PSQLv11.aspx) at the bottom of the page. However, I'm not sure which ones I need. I'm using VS2013 Community.