Functions using bitsets

Hi, I have been trying to get the following function using bitset structures to work for a while now, but I can't seem to get around the segmentation fault. Can anyone point out where I'm going wrong with this:

#include <bitset>
#include <iostream>
#include <cstdlib>
using namespace std;

#define MAXBITCOUNT 210

typedef struct our_bitset{
    bitset<MAXBITCOUNT> bo;
} ob;

class bitset_class {
public:
    ob binread;
    void set_bits();
    void conversion(unsigned short int*, int);
};

// Clear every bit of the stored bitset
void bitset_class::set_bits(){
    for(int i=0; i<MAXBITCOUNT; i++)
        binread.bo.set(i,0);
}

void bitset_class::conversion(unsigned short int* single_read, int seq_length){
    int i, j;
    bitset<3> temp;

    for (i=0; i<seq_length; i++, single_read++){

        // Encode the current value (0..4) as a 3-bit pattern
        switch (*single_read){

        case 0:
            temp.set(0, 0);
            temp.set(1, 0);
            temp.set(2, 0);
            // cout << "0\t" << temp << endl;
            break;

        case 1:
            temp.set(0, 0);
            temp.set(1, 0);
            temp.set(2, 1);
            // cout << "1\t" << temp << endl;
            break;

        case 2:
            temp.set(0, 0);
            temp.set(1, 1);
            temp.set(2, 0);
            // cout << "2\t" << temp << endl;
            break;

        case 3:
            temp.set(0, 0);
            temp.set(1, 1);
            temp.set(2, 1);
            // cout << "3\t" << temp << endl;
            break;

        case 4:
            temp.set(0, 1);
            temp.set(1, 0);
            temp.set(2, 0);
            // cout << "4\t" << temp << endl;
            break;
        }

        // Copy the 3-bit pattern into the i-th group from the top of the bitset
        for(j=0; j<3; j++){
            binread.bo[MAXBITCOUNT-3*(i+1) + j] = temp[j];
        }
    }
}

int main(){
    int i;
    bitset_class* bin_reads;
    // n, r, and J are defined elsewhere (see the description below)
    bin_reads = (bitset_class*)malloc(n * sizeof(bitset_class));

    for(i=0; i<n; i++)
        bin_reads[i].set_bits();

    for(i=0; i<n; i++){
        bin_reads[i].conversion(r[i], J);
        cout << i+1 << "th read is" << endl << bin_reads[i].binread.bo << endl << endl;
    }
    return 0;
}

where r[i] is an array of numbers ranging from 0 to 4, J = 53, and n = 2.

Thanks
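
(For reference, declarations consistent with that description might look like the following. The contents of r shown here are placeholders taken from the example later in the thread, not the actual data:)

const int n = 2;   // number of reads
const int J = 53;  // length of each read
unsigned short r[n][J] = {
    { 1, 4, 2, 0, 3, 2 /* ...53 values between 0 and 4 per row... */ },
    { 2, 2, 1, 1, 0, 4 /* ... */ }
};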
You cannot, under most circumstances (including yours), use malloc() for classes. malloc() does not run constructors, so your bitset_class instances are being used uninitialized.
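
As a minimal illustration of the difference (a hypothetical demo function, assuming the bitset_class definition from the question):

#include <cstdlib>

void allocation_demo() {
    // malloc returns raw, uninitialized memory: no constructor runs,
    // so the bitset member was never set up
    bitset_class* a = (bitset_class*)malloc(sizeof(bitset_class));

    // new runs the default constructor, which default-constructs the
    // member bitset (all bits zero), making set_bits() unnecessary
    bitset_class* b = new bitset_class;

    delete b;  // runs the destructor, then releases the memory
    free(a);   // raw memory from malloc is released with free
}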
How should I initialize the bitset_class variable? Should I allocate memory for each structure in the class inside the for loop?
I would also appreciate it if you could tell me a simpler method to accomplish this, i.e., converting a two-dimensional array of integers ranging from 0 to 4 and storing them as a series of binary words.
I don't know if n (in main) is constant or not. If it is constant, then you can simply allocate an array of bitsets on the stack; otherwise, use new:

// Runs the default constructor of bitset_class for each of the n instances created
bitset_class* bin_reads = new bitset_class[ n ];

I need to know what the parameters to conversion() are supposed to be. Is seq_length supposed to be the number of bytes to convert, the number of unsigned shorts to convert (guessing that short == 2 bytes), or the number of bits?
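
If n is not constant, std::vector also works and avoids manual allocation entirely. A minimal sketch, assuming the bitset_class defined in the question:

#include <iostream>
#include <vector>

int main() {
    int n;
    std::cin >> n;                           // n known only at run time
    std::vector<bitset_class> bin_reads(n);  // default-constructs every element
    // use bin_reads[i] exactly as before; the vector frees itself, whereas
    // with new[] you would need a matching delete[] bin_reads;
    return 0;
}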
I just tried with this initialization and I am still getting a segmentation fault.

n in main is a constant.

seq_length is just the size of the r[i] array; i.e., above, r[i] is an array of 53 unsigned short ints, and r is a two-dimensional array with n rows of seq_length unsigned short ints.
So then is conversion supposed to take all the bits from 53 unsigned shorts (53*16 is over 800 bits) and stuff them into a bitset<> that contains only 210 bits? Or what is conversion supposed to do?
The idea is to convert a series of numbers (stored in r) ranging from 0 to 4 into a bitset that stores all of the numbers, each in 3-bit binary form, concatenated into a single binary word.

For example, let's say r is as follows:

1 4 2 0 3 2
2 2 1 1 0 4

Then the two resulting bitsets should be:

001100010000011010 (001 100 010 000 011 010)
010010001001000100 (010 010 001 001 000 100)

I also realised that n is only constant in this case. It is usually taken as an input.
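
For what it's worth, here is a minimal sketch of a simpler conversion along those lines. The helper name pack() is hypothetical, and it assumes bit 0 of each 3-bit group is the least significant bit of the value, which reproduces the example output above (the switch in the original code appears to write each group in the reverse bit order):

#include <bitset>
#include <iostream>

const int MAXBITCOUNT = 210;

// Hypothetical helper: packs seq_length values in the range 0..4 into out,
// 3 bits per value, with the first value in the leftmost (highest) bits
void pack(const unsigned short* vals, int seq_length, std::bitset<MAXBITCOUNT>& out)
{
    for (int i = 0; i < seq_length; ++i)
        for (int b = 0; b < 3; ++b)  // b = 0 is the least significant bit
            out[MAXBITCOUNT - 3*i - 3 + b] = (vals[i] >> b) & 1;
}

int main() {
    unsigned short row[] = { 1, 4, 2, 0, 3, 2 };
    std::bitset<MAXBITCOUNT> bits;
    pack(row, 6, bits);
    // Prints 001100010000011010 followed by 192 zeros of padding
    std::cout << bits << '\n';
    return 0;
}

Each 3-bit group then reads left to right exactly like the rows in the example, with the unused low bits left as zero padding.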