Writing a bitset into a binary file

Jul 16, 2013 at 3:39pm
Hi, I am trying to find a way to write a bitset into a binary file.
I tried the following code:
bitset<10> first("101011");
ofstream output("binary.bin",ios::binary);
output<<first;

This code outputs a file containing the text 101011, so it is not really a binary file.
I thought about using the write function to output binary data, but it takes a char pointer and a length. Is there any way to solve this problem? By the way, the file size needs to stay small. I tried writing the bits out as a string, but that made the file bigger. This is for Huffman encoding and decoding.
Thanks!
Jul 16, 2013 at 4:01pm
#include <bitset>
#include <fstream>
#include <limits>

std::bitset<10> first( "101011" ) ;
// a non-constexpr bitset object can't appear in a constant expression,
// so check the bit count itself:
static_assert( 10 <= std::numeric_limits<unsigned long>::digits, "too many bits" ) ;

// write
{
    std::ofstream output("binary.bin",std::ios::binary);
    unsigned long n = first.to_ulong() ;
    output.write( reinterpret_cast<const char*>(&n), sizeof(n) ) ;
}

// read
{
    std::ifstream input("binary.bin",std::ios::binary);
    unsigned long n ;
    input.read( reinterpret_cast<char*>(&n), sizeof(n) ) ;
    first = n ;
}

Jul 16, 2013 at 4:03pm
By the way, the file size must be compressed


This is a whole other can of worms.

By 'compressed' I assume you mean the data is bit-packed, so that there is no padding between bitsets in the file.


You will not be able to do this in a single function call. At least not without writing your own.

Files work with 8-bit bytes. You cannot write any value smaller than a single byte. So if you have a 10-bit value, that is going to occupy 16 bits (2 bytes) in the file. The extra 6 bits are "padding".

If you want to remove that padding, you'll have to run all i/o through some kind of class or function which takes bits out of the bitset, and writes them to the file only after it has accumulated 8 bits of data.

pseudo-code:

int bitswritten = 0;
unsigned char bitbuffer = 0;

void write(ostream& str, const bitset& bs)
{
    while( bits_remaining_in_bs )
    {
        int bitstoextract = 8 - bitswritten;

        bitbuffer |= extract_bits_from_bitset( bs, bitstoextract ) << bitswritten;
        bitswritten += number_of_bits_actually_extracted;

        if(bitswritten == 8)
        {
            str.write( reinterpret_cast<char*>(&bitbuffer), 1 );
            bitswritten = 0;
            bitbuffer = 0;
        }
    }
}
Jul 16, 2013 at 5:02pm
closed account (Dy7SLyTq)
Files work with 8-bit bytes


nitpicking... but isn't a byte by definition 8 bits?
Jul 16, 2013 at 5:04pm
Yes and no. Technically it's platform specific... but pretty much any platform these days is going to have 1 byte = 8 bits.
Jul 16, 2013 at 5:06pm
closed account (Dy7SLyTq)
Good to know. So, not to derail, but when I need to work at that level, is it safe to assume the standard of 4 bits = 1 nibble, 2 nibbles = 1 byte, etc.? Or should I run a test somehow?
Jul 16, 2013 at 6:16pm
closed account (zb0S216C)
@DTSCode: I wouldn't worry too much about a byte being less/greater than 8 bits in length unless your software targets some specialised hardware. You can perform a check to validate the length of a byte, but 9 times out of 10 the length of a byte will be 8 bits.

Even so, if you did target some specific piece of hardware, you'd read the hardware's corresponding specification and check the size of a byte. If the specification does not state the size of a byte, you have two options:

1) assume a byte on the targeted hardware has a size of 8-bits or...
2) contact the hardware's vendor and inquire about it.

Wazzak
Last edited on Jul 16, 2013 at 6:21pm
Topic archived. No new replies allowed.