Converting arrays to bitsets

Hi, I have been trying to write code that does the following:


The idea is to convert a series of numbers (stored in r), each ranging from 0 to 4, into a bitset that stores all of them in binary form (3 bits per number) in a single binary word.

For example, let's say r is as follows:

1 4 2 0 3 2
2 2 1 1 0 4

Then the two resulting bitsets should be:

001100010000011010 (i.e., 001 100 010 000 011 010)
010010001001000100 (i.e., 010 010 001 001 000 100)


The code I have is below:

#define MAXBITCOUNT 210

typedef struct our_bitset{
    bitset<MAXBITCOUNT> bo;
} ob;

class bitset_class {
public:
    ob binread;
    void set_bits();
    void conversion(unsigned short int*, int);
};

void bitset_class::set_bits(){
    for(int i=0; i<MAXBITCOUNT; i++)
        binread.bo.set(i,0);
}

void bitset_class::conversion(unsigned short int* single_read, int seq_length){
    int i, j;
    bitset<3> temp;

    for (i=0; i<seq_length; i++, single_read++){

        switch (*single_read){

            case 0:
                temp.set(0, 0);
                temp.set(1, 0);
                temp.set(2, 0);
                // cout << "0\t" << temp << endl;
                break;

            case 1:
                temp.set(0, 0);
                temp.set(1, 0);
                temp.set(2, 1);
                // cout << "1\t" << temp << endl;
                break;

            case 2:
                temp.set(0, 0);
                temp.set(1, 1);
                temp.set(2, 0);
                // cout << "2\t" << temp << endl;
                break;

            case 3:
                temp.set(0, 0);
                temp.set(1, 1);
                temp.set(2, 1);
                // cout << "3\t" << temp << endl;
                break;

            case 4:
                temp.set(0, 1);
                temp.set(1, 0);
                temp.set(2, 0);
                // cout << "4\t" << temp << endl;
                break;
        }

        for(j=0; j<3; j++){
            binread.bo[MAXBITCOUNT-3*(i+1) + j] = temp[j];
        }
    }
}

int main(){
    int i;
    bitset_class* bin_reads = new bitset_class[n];

    for(i=0; i<n; i++)
        bin_reads[i].set_bits();

    for(i=0; i<n; i++){
        bin_reads[i].conversion(r[i], J);
        cout<<i+1<<"th read is"<<endl<<bin_reads[i].binread.bo<<endl<<endl;
    }
    return 0;
}


where r[i] is an array of numbers ranging from 0 to 4, and J and n are the number of columns and rows in r, respectively. r is taken as input.

There is a segmentation fault in the code that I can't seem to resolve.

I would appreciate it if someone could point out where I am going wrong, or suggest a more efficient way to do the same.

Thanks!
You can do this without any switch statements.

Here is some sample code; you'll need to extend it a bit to fit your needs.

unsigned char c = 3;  // 011
bitset<3> bits;

// Note: set() can take two parameters -- the second being the boolean value
bits.set( 0, c & ( 1 << 2 ) );
bits.set( 1, c & ( 1 << 1 ) );
bits.set( 2, c & ( 1 << 0 ) );
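
For instance, to pack a whole row this way you could wrap it in a loop. This is only a sketch; the function name pack_row and its parameters are placeholders rather than code from the original post, and here bit j of each value goes to bit j of its 3-bit group, so the printed groups read in normal binary order (4 prints as 100):

#include <bitset>
using namespace std;

#define MAXBITCOUNT 210

// Pack each value (expected to be 0..4) into a 3-bit group, filling the
// bitset from the high end so the first value prints leftmost.
bitset<MAXBITCOUNT> pack_row( const unsigned short* row, int length )
{
    bitset<MAXBITCOUNT> bits;
    for (int i = 0; i < length; i++)
        for (int j = 0; j < 3; j++)
            bits[ MAXBITCOUNT - 3*(i+1) + j ] = ( row[i] >> j ) & 1;
    return bits;
}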

Thanks, that is a much better way of going about it!
Why not just OR the value directly in?
#include <bitset>
#include <iostream>
#include <sstream>
#include <string>
using namespace std;

typedef bitset <18> bits_t;

istream& read_quadruplets( istream& ins, bits_t& bits )
  {
  unsigned value;
  while (ins >> value)
    {
    bits <<= 3;
    bits  |= value & 0x7;
    }
  return ins;
  }

int main()
  {
  unsigned long values[ 2 ];
  bits_t        bits  [ 2 ];
  string        output[ 2 ];
  string        input [ 2 ] =
    {
    "1 4 2 0 3 2",
    "2 2 1 1 0 4"
    };

  for (unsigned n = 0; n < 2; n++)
    {
    istringstream ss( input[ n ] );
    read_quadruplets( ss, bits[ n ] );
    values[ n ] = bits[ n ].to_ulong();
    output[ n ] = bits[ n ].to_string();

    cout <<   "input:  " << input[ n ]
         << "\nvalue:  " << values[ n ]
         << "\noutput: " << output[ n ]
         << "\n\n";
    }

  return 0;
  }

The bitset is restricted in the sense that it always contains N bits. I used 18 in the example only because both input rows pack into exactly 18 bits. If you use more (say, 210), then you'll have to crop the output of to_string() to keep only the rightmost characters that are actually in use.
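
As a sketch of that cropping step (bits_used is a placeholder name for 3 times the number of values actually stored):

#include <bitset>
#include <string>
using namespace std;

bitset<210> bits;                    // filled as above
size_t bits_used = 18;               // e.g. 6 values * 3 bits each

// to_string() gives all 210 characters; keep only the rightmost bits_used
string cropped = bits.to_string().substr( bits.size() - bits_used );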

Hope this helps.

Thanks, this was very helpful. I am still having trouble with segmentation faults, though. I don't think I need to be using a class here. What is the best way to declare and store an array of bitsets? That is essentially what I was trying to do with the struct ob and the class bitset_class.
I don't see how you are getting far enough to have segmentation faults. You are using variables you have not declared:
D:\prog\cc\foo> g++ a.cpp
a.cpp: In function 'int main()':
a.cpp:76: error: 'n' was not declared in this scope
a.cpp:82: error: 'r' was not declared in this scope
a.cpp:82: error: 'J' was not declared in this scope

Compilation should fail.


Segmentation faults typically occur because you are trying to access memory that does not belong to you. That looks very likely with your main loops, in which you access elements of r that are not defined (not to mention bin_reads, where n is not defined).
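
As a rough sketch of what the surrounding declarations would need to look like for those loops to stay in bounds (the sizes, the sample data, and the use of a vector instead of new[] are my guesses from your example, not your actual program):

#include <bitset>
#include <iostream>
#include <vector>
using namespace std;

int main()
{
    const int n = 2, J = 6;                      // rows and columns of r
    unsigned short r[2][6] = { { 1, 4, 2, 0, 3, 2 },
                               { 2, 2, 1, 1, 0, 4 } };

    // one bitset per row; sized by n so the loops below cannot run off the end
    vector< bitset<210> > bin_reads( n );

    for (int i = 0; i < n; i++)
    {
        for (int j = 0; j < J; j++)
            for (int k = 0; k < 3; k++)
                bin_reads[i][ 210 - 3*(j+1) + k ] = ( r[i][j] >> k ) & 1;

        cout << (i+1) << "th read is\n" << bin_reads[i] << "\n\n";
    }
    return 0;
}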

Hope this helps.
Sorry, I haven't given those values; this is actually a function in a bigger program where n, r, and J are given as inputs. Currently I am taking r as below:

1 4 2 0 3 2
2 2 1 1 0 4

where n = 2 and J = 6
Just guessing that you are walking off the end of an array somewhere.

If you want us to help, you'll need to post the complete source.
Topic archived. No new replies allowed.