initializing bitfield inside a structure

I found what the problem was, but I don't understand exactly why it was a problem. So here it is.
I had this bitfield, and the unused bits were of type int:
typedef struct _bitfield_flags
{
	unsigned char read : 1;
	unsigned char write : 1;
	unsigned char unbuf : 1;
	unsigned char eof : 1;
	unsigned char err : 1;
	int : 3; //unused bits
} bitfield_flags;


Now I tried to initialize this struct my_FILE, which has bitfield_flags as a member:

#define OPEN_MAX 20   //nr. of files that can be opened simultaneously

typedef struct _my_iobuf {
	int cnt;	
	char* ptr;      
	char* base;     
	bitfield_flags bit_flag; /*mode of file access but using bitfields*/
	int fd;         /*file descriptor*/
	
} my_FILE;


my_FILE my_iob[OPEN_MAX] = {
	{ 0, (char*)0, (char*)0,{ 1,0,0,0,0 } ,0},
	{ 0, (char*)0, (char*)0,{ 0,1,0,0,0 } ,1},
	{ 0, (char*)0, (char*)0,{ 0,1,1,0,0 } ,2}
};


The problem with this was that in my_iob[0], my_iob[1], and my_iob[2] the bitfield was initialized to { 1,0,0,0,0 } in all 3 cases, and my last members, the file descriptors, which should have been {0, 1, 2}, were now {0, 0, 3}.

All I had to do was change
int : 3; //unused bits
to
unsigned char : 3; //unused bits
and all was good.
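That is, the struct becomes:

typedef struct _bitfield_flags
{
	unsigned char read : 1;
	unsigned char write : 1;
	unsigned char unbuf : 1;
	unsigned char eof : 1;
	unsigned char err : 1;
	unsigned char : 3; //unused bits, now the same type as the named members
} bitfield_flags;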

Can anyone explain this weird behavior?
closed account (48bpfSEw)
Hi, I've never used this bit syntax on a type, but I think it has something to do with the size of the integer type. What do you get for sizeof(int) with your compiler? Did you isolate the problem and test it separately?

typedef struct _bitfield_flags {
	int : 3;
} bitfield_flags;


bitfield_flags bf0 = {0};
out << bf0;

bitfield_flags bf1 = {1};
out << bf1;

bitfield_flags bf2 = {2};
out << bf2;


Hi man, my sizeof(int) is 4, but I don't think it's possible to assign a value to an unnamed bit field like that.
closed account (48bpfSEw)
Hi ^^

Maybe you can figure out the problem if you convert your source code to assembly:

https://assembly.ynh.io/

This is the output of your code:

             	.Ltext0:
             		.globl	my_iob
             		.data
             		.align 32
             	my_iob:
0000 00000000 		.long	0
0004 00000000 		.zero	4
0008 00000000 		.quad	0
     00000000 
0010 00000000 		.quad	0
     00000000 
0018 01       		.byte	1
0019 000000   		.zero	3
001c 00000000 		.long	0
0020 00000000 		.long	0
0024 00000000 		.zero	4
0028 00000000 		.quad	0
     00000000 
0030 00000000 		.quad	0
     00000000 
0038 02       		.byte	2
0039 000000   		.zero	3
003c 01000000 		.long	1
0040 00000000 		.long	0
0044 00000000 		.zero	4
0048 00000000 		.quad	0
     00000000 
0050 00000000 		.quad	0
     00000000 
0058 06       		.byte	6
0059 000000   		.zero	3
005c 02000000 		.long	2
0060 00000000 		.zero	544
     00000000 
     00000000 
     00000000 
     00000000 
             		.text

             	.Letext0:
Thanks man, I would love to know assembly, but right now it won't help me :D
Hi,

How about std::bitset ?
http://en.cppreference.com/w/cpp/utility/bitset


It's much better than a C-style bit field, which has endianness issues. You also get the usual C++ features.
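For instance, a minimal sketch (the flag names here are made up for illustration, they are not from your code):

#include <bitset>
#include <iostream>

// illustrative positions for the five flags
enum flag_pos { f_read, f_write, f_unbuf, f_eof, f_err };

int main()
{
    std::bitset<5> flags;                     // all five bits start at 0
    flags.set(f_write);                       // like { 0,1,0,0,0 }
    flags.set(f_unbuf);                       // like { 0,1,1,0,0 }

    std::cout << flags << '\n';               // prints 00110 (bit 4 first)
    std::cout << flags.test(f_write) << '\n'; // prints 1
}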
closed account (48bpfSEw)
"do what you love!" ^^
Start learning assembly:

http://www.tutorialspoint.com/assembly_programming/index.htm

I'll learn assembly for sure, but I have some other things to learn first :)
About the bitset - I can't actually use it because this is taken from a C program :D I just like this forum the most, and C++ also supports bit fields, so I decided to ask here.

Mostly I just wanted to know why int : 3 gives this weird behavior while unsigned char : 3 works as I expected, even though both are just 3 unused bits. They are not even used, yet they give different results.

Probably, yeah, only looking at the assembly could show what's going on there, but that's a task for the future.

> It's much better than a C-style bit field, which has endianness issues.

So basically you are saying I should just use bitwise operations instead of bit fields in C? I don't like those fields too much anyway :D
unsigned char : 3 will continue to allocate bits in the current char variable.

int : 3 will allocate a fresh variable of type int, and assign the first three bits as requested.

Also, because the int will usually be aligned on a particular boundary for efficient access, several chars will be left unused.

On my 32-bit system, that increases the size of the struct to 1 + 3 (padding) + 4 = 8 bytes.
#include <stdio.h>

struct
{
    unsigned char read  : 1;
    unsigned char write : 1;
    unsigned char unbuf : 1;
    unsigned char eof   : 1;
    unsigned char err   : 1;
    int                 : 3; //unused bits
} bitfield_flags1;

struct
{
    unsigned char read  : 1;
    unsigned char write : 1;
    unsigned char unbuf : 1;
    unsigned char eof   : 1;
    unsigned char err   : 1;
    unsigned char       : 3; //unused bits
} bitfield_flags2;

int main()
{
    printf("size of bitfield_flags1 = %d \n", sizeof (bitfield_flags1));
    printf("size of bitfield_flags2 = %d \n", sizeof (bitfield_flags2));
}

My output:
size of bitfield_flags1 = 8
size of bitfield_flags2 = 1
I don't know exactly what you're trying to do, but if you change the type in a bit field you create a new (uninitialized) variable. So the first unsigned char is initialized while the second int is not.

I recommend using bitwise operators instead of the slow bit fields.

See:
http://www.cplusplus.com/doc/tutorial/operators/
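For example, a minimal sketch of the bitwise approach (the mask names are made up for illustration):

#include <stdio.h>

/* illustrative flag masks, one bit each */
enum {
    F_READ  = 1 << 0,
    F_WRITE = 1 << 1,
    F_UNBUF = 1 << 2,
    F_EOF   = 1 << 3,
    F_ERR   = 1 << 4
};

int main(void)
{
    unsigned char flags = 0;

    flags |= F_WRITE | F_UNBUF;        /* set two flags: like { 0,1,1,0,0 } */
    flags &= (unsigned char)~F_UNBUF;  /* clear one flag again */

    printf("write set? %d\n", (flags & F_WRITE) != 0);  /* prints 1 */
    printf("unbuf set? %d\n", (flags & F_UNBUF) != 0);  /* prints 0 */
    return 0;
}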
Allocation of bit-fields within a class object is implementation-defined. Alignment of bit-fields is implementation-defined. Bit-fields are packed into some addressable allocation unit. [Note: Bit-fields straddle allocation units on some machines and not on others. Bit-fields are assigned right-to-left on some machines, left-to-right on others. —end note ]


> bitfield in was initialized to { 1,0,0,0,0 } in all 3 cases and my last parameters for File
> descriptors that were {0, 1, 2} were now {0, 0, 3}

Can't reproduce this behaviour with any compiler that I tried.

#include <stdio.h>

typedef struct /*_bitfield_flags*/ bitfield_flags {

	unsigned char read : 1;
	unsigned char write : 1;
	unsigned char unbuf : 1;
	unsigned char eof : 1;
	unsigned char err : 1;
	int : 3; //unused bits
} bitfield_flags;

typedef struct /*_my_iobuf*/ my_FILE {

	int cnt;
	char* ptr;
	char* base;
	bitfield_flags bit_flag; /*mode of file access but using bitfields*/
	int fd;         /*file descriptor*/

} my_FILE;

#define OPEN_MAX 20

my_FILE my_iob[OPEN_MAX] = {

	{ 0, (char*)0, (char*)0,{ 1,0,0,0,0 } ,0},
	{ 0, (char*)0, (char*)0,{ 0,1,0,0,0 } ,1},
	{ 0, (char*)0, (char*)0,{ 0,1,1,0,0 } ,2}
};

void print( bitfield_flags flags, int fd ) {

   printf( "{ %u, %u, %u, %u, %u }, %d\n", flags.read, flags.write, flags.unbuf, flags.eof, flags.err, fd ) ;
}

int main()
{
    printf( "sizeof(bitfield_flags):%llu\n\n", (unsigned long long)sizeof(bitfield_flags) ) ;
    
    for( int i = 0 ; i < 3 ; ++i ) print( my_iob[i].bit_flag, my_iob[i].fd ) ;
    puts("") ;
}


sizeof(bitfield_flags):1

{ 1, 0, 0, 0, 0 }, 0
{ 0, 1, 0, 0, 0 }, 1
{ 0, 1, 1, 0, 0 }, 2

http://coliru.stacked-crooked.com/a/3d1ac92549c0f452

Note: sizeof(bitfield_flags) is 1 on Linux (LLVM, GNU) and 8 on Windows (Microsoft, LLVM and GNU).
Thank you very much guys, exactly what I was looking for! :)
I was doing this on VS 2015, like always, and I got that behavior.
> I was doing this on VS 2015 like always and I got that behavior

I checked it on VS 2015 (both Microsoft and LLVM front-ends, both 32-bit and 64-bit targets) and got expected results.
sizeof(bitfield_flags):8

{ 1, 0, 0, 0, 0 }, 0
{ 0, 1, 0, 0, 0 }, 1
{ 0, 1, 1, 0, 0 }, 2

Can't show it online; rextester is baulking.
I just ran your code and here is what I got:
sizeof(bitfield_flags):8

{ 1, 0, 0, 0, 0 }, 0
{ 1, 0, 0, 0, 0 }, 0
{ 0, 0, 0, 0, 0 }, 0


I don't know what's wrong with my VS :D

PS. Ohh, maybe it has something to do with the fact that when I create my C programs I create a main.c file instead of main.cpp?
> I dont know whats wrong with my VS :D

What does this print out?

#include <iostream>

int main()
{
    std::cout << _MSC_FULL_VER << '\n' ;
}

It prints 190023026.
So it has nothing to do with the fact that I ran your program in main.c instead of main.cpp?
OK, I just ran your program in main.cpp and it worked fine! So try running it in main.c and you will probably see the behavior I was getting.
> It prints 190023026
> OK I just runed your program in main.cpp and it worked fine!

In any case, update the compiler (install Update 2). 190023026 is the old compiler.


> So try running it in main.c and you will probably see the behavior I was getting.

In general, with the Microsoft tool-chain, use the C++ compiler, even for C code.
See: http://www.cplusplus.com/forum/general/145334/#msg765455

Or to compile pure C in Visual Studio, use the 'Clang with Microsoft CodeGen' tool-chain.
Thanks man for the tips and info, as always :)