What's the best way to store multiple of the above structure?
Ideally it would be binary output, but that isn't possible with strings, unless I'm missing something. So that leaves me with two options:
Create a second structure with an outrageous char[] length, and copy the data from the original structure to write/read to/from the file. The flaws I see in this are the extra runtime spent copying and the larger amount of memory temporarily needed. The positive is that I can use binary output, which is easier.
The second option is to use an ASCII file, put each field on its own line, and use getline to read each line. The flaws I see are that I have to parse the file into the structure instead of just reading the structure directly, and that the file can be tampered with (not a huge concern). The positive is that it avoids the negatives above.
So which would you suggest? Or am I missing a better way?
struct Book{
char bookTitle[64]; //63-character string ~ 8 words, assuming there are 7 characters in each word + 1 space
char ISBN[16]; //15-character string, should be enough for isbn 8-13 digit
char author[32]; //31-character string ~ 4 words good enough for full name
char publisher[32]; //31-character string...
char dateAdded[16]; //15-character string, should be more than enough
int qty;
double wholesale;
double retail;
};
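The ASCII option described above (one field per line, read back with getline) might look like the sketch below. The helper names (writeBook, readBook) and the numeric conversions are my own assumptions about how you'd parse it, not the only way:

```cpp
#include <cstring>
#include <fstream>
#include <sstream>
#include <string>

struct Book {
    char bookTitle[64];
    char ISBN[16];
    char author[32];
    char publisher[32];
    char dateAdded[16];
    int qty;
    double wholesale;
    double retail;
};

// Write one field per line; numbers are written as text.
void writeBook(std::ostream& out, const Book& b) {
    out << b.bookTitle << '\n' << b.ISBN << '\n' << b.author << '\n'
        << b.publisher << '\n' << b.dateAdded << '\n'
        << b.qty << '\n' << b.wholesale << '\n' << b.retail << '\n';
}

// Read the fields back with getline; returns false at end of file.
bool readBook(std::istream& in, Book& b) {
    std::string line;
    auto text = [&](char* dst, std::size_t n) {
        if (!std::getline(in, line)) return false;
        std::strncpy(dst, line.c_str(), n - 1);
        dst[n - 1] = '\0';  // strncpy does not always terminate
        return true;
    };
    if (!text(b.bookTitle, sizeof b.bookTitle)) return false;
    if (!text(b.ISBN, sizeof b.ISBN)) return false;
    if (!text(b.author, sizeof b.author)) return false;
    if (!text(b.publisher, sizeof b.publisher)) return false;
    if (!text(b.dateAdded, sizeof b.dateAdded)) return false;
    if (!std::getline(in, line)) return false;
    b.qty = std::stoi(line);
    if (!std::getline(in, line)) return false;
    b.wholesale = std::stod(line);
    if (!std::getline(in, line)) return false;
    b.retail = std::stod(line);
    return true;
}
```

Note this format has no random access: to update record N you have to rewrite the whole file, which is one of the trade-offs being weighed here.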
sizeof(Book) should be about 180 bytes; the fields sum to 180, and alignment padding before the doubles typically rounds it up to 184 on a 64-bit compiler. You can reinterpret_cast a Book* to char* and write it to a binary file as your 'database' file. You can then randomly access any Book record in your database using seekg (read) or seekp (write).
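A minimal sketch of that fixed-size-record idea follows. The helper names and the file name are assumptions, and note this file format is only portable between builds that agree on the struct's size and padding:

```cpp
#include <cstring>
#include <fstream>

struct Book {
    char bookTitle[64];
    char ISBN[16];
    char author[32];
    char publisher[32];
    char dateAdded[16];
    int qty;
    double wholesale;
    double retail;
};

// Record i lives at byte offset i * sizeof(Book), so any record
// can be reached directly with one seek.
void writeRecord(std::fstream& f, long i, const Book& b) {
    f.seekp(static_cast<std::streamoff>(i) * sizeof(Book));
    f.write(reinterpret_cast<const char*>(&b), sizeof(Book));
}

bool readRecord(std::fstream& f, long i, Book& b) {
    f.seekg(static_cast<std::streamoff>(i) * sizeof(Book));
    f.read(reinterpret_cast<char*>(&b), sizeof(Book));
    return static_cast<bool>(f);
}
```

Usage would be opening the file once with std::ios::in | std::ios::out | std::ios::binary, then seeking per record; updating record N is a single writeRecord call rather than rewriting the whole file.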
So the question is, is that the best way, when I could potentially have all my data be half that size? I realize we're dealing with bits here, but I'm still curious if there's a better way to keep the size as small as possible?
I like strings due to their versatility: whether it's a 9-character title or a 4000-character title, they work.
It is very costly to update a record if Book's size is dynamic (not constant), and you cannot randomly access records.
If you want to keep the size smaller, decrease the lengths of bookTitle, ISBN, author, ...
struct Book{
char bookTitle[41]; //40-character string ~ 5 words, should be enough for book title...
char ISBN[14]; //13-character string, should be enough for isbn 8-13 digit
char author[21]; //20-character string ~ 3 words enough for full name
char publisher[21]; //20-character string...
char dateAdded[11]; //10-character string, just enough for mm/dd/yyyy format
int qty;
double wholesale;
double retail;
};
Size of Book is now (theoretically) 41 + 14 + 21 + 21 + 11 + 4 + 8 + 8 = 128 bytes
or even more compact
struct Book{
char bookTitle[31]; //30-character string ~ 5 average-length-words, just enough for book title?
char ISBN[14]; //13-character string
char author[16]; //15-character string
char publisher[16]; //15-character string...
char dateAdded[11]; //10-character string, just enough for mm/dd/yyyy format
int qty; //4 bytes, you can't do anything about this
double wholesale; //8 bytes, you can't do anything about this
double retail; //8 bytes, you can't do anything about this
};
~108 bytes by field sum (a typical compiler pads this to 112 for alignment), but you may lose data on a long author's name or a long book title...
or use a database, but when creating a table you will need to specify the length of each string like this, too...
I did realize that when using a database you have to define cell widths. So I guess the question is: is this better than parsing a file line by line? Ideally, I'd probably do something like
So by using set widths, am I gaining enough speed to justify the extra size when I really only need a ~50-byte structure? Taking into account random access, and being able to read straight into the structure.
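One way to keep the versatility of std::string at the edges while still storing fixed-width records is to truncate on copy. This is a sketch; setField is a hypothetical helper name, not part of any library:

```cpp
#include <cstring>
#include <string>

// Copy s into a fixed-size char field of capacity n, truncating if it
// doesn't fit and always NUL-terminating the result.
inline void setField(char* dst, std::size_t n, const std::string& s) {
    std::strncpy(dst, s.c_str(), n - 1);
    dst[n - 1] = '\0';
}
```

With this, application code can work in std::string and only convert when writing a record, e.g. setField(b.bookTitle, sizeof b.bookTitle, title); overly long titles are silently cut rather than overflowing the buffer.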