Char array to int

Can anyone point out what's going wrong with my code?

I have tried to convert an integer (-998) to a character array in hex form, and I want to read it back as the same integer, but it is displaying some other number.

#include "stdafx.h"
#include <iostream>
#include <iomanip>
#include <stdlib.h>
#include <cstdio>


using namespace std;

int main( void )
{
int i = 0xfffffc1a; // = -998

char ch[33];

cout<< i<<endl;

cout<<hex<< i<<endl;

sprintf(ch,"0x%X",i);
int *h = (int *)ch;

cout<<" the int value is "<<*h<<endl;
cout << ch<<endl;
return 0;
}

The output is:
-998
fffffc1a
the int value is 46467830
0xFFFFFC1A
Press any key to continue . . .


int *h = (int *)ch;
What this line does is assign h the address of the first element of ch, so *h is the value of the first four bytes of ch viewed as an int. That's exactly what you got: the stream is still in hex mode from the earlier cout << hex, and 0x46467830 is just the ASCII codes of the first four characters '0' 'x' 'F' 'F' (0x30 0x78 0x46 0x46) read back-to-front on a little-endian machine. What you want to do could be done like this:

#include <cstdlib>
#include <iostream>
#include <stdio.h>
using namespace std;

int main()
{
    char str_dec[30];
    char str_hex[30];
    
    int n = -998;
    sprintf( str_hex, "%x", n );  // hex text, e.g. "fffffc1a"
    sprintf( str_dec, "%d", n );  // decimal text, e.g. "-998"
    
    cout << "str_hex: " << str_hex << endl;
    cout << "str_dec: " << str_dec << endl;
    
    system("pause");
    return 0;
}

Or... did you want to turn the string back into an integer? That can be done using sscanf like this:

#include <cstdlib>
#include <iostream>
#include <stdio.h>
using namespace std;

int main()
{
    char str_dec[30];
    char str_hex[30];
    
    int n = -998;
    sprintf( str_hex, "%x", n );
    sprintf( str_dec, "%d", n );
    
    cout << "str_hex: " << str_hex << endl;
    cout << "str_dec: " << str_dec << endl;
    
    int m;
    sscanf( str_hex, "%x", &m );  // parse the hex text back into an int
    cout << hex << m << endl;
    sscanf( str_dec, "%d", &m );  // parse the decimal text back into an int
    cout << dec << m << endl;
    
    system("pause");
    return 0;
}

Umm, what I want to do is create a char array of the integer split into a high byte and a low byte.

I want to send the integer over a serial COM port which only receives 8 bytes at a time. And by the way, the device I'm interfacing with has a 16-bit int.
what I want to do is create a char array of the integer split into highbyte and lowbyte

Why would you mess with the int's ASCII representation, then? :/

I work with 32-bit ints, but you should be able to use this with a little modification:

#include <cstdlib>
#include <iostream>
#include <stdio.h>
using namespace std;

int main()
{
    int number;
    unsigned char * split;
    
    cout << "enter number: ";
    cin >> number;
    
    // point at the first byte of the int's in-memory representation
    split = (unsigned char*)&number;
    
    cout << "byte 0: " << (int)*split << endl;
    cout << "byte 1: " << (int)*(split+1) << endl;
    cout << "byte 2: " << (int)*(split+2) << endl; // <- comment this out for 16-bit ints
    cout << "byte 3: " << (int)*(split+3) << endl; // <- comment this out for 16-bit ints
    
    system("pause");
    return 0;
}


One more important thing you have to bear in mind is endianness. My machine stores the low byte in the first position and the high byte in the last; yours might do it the opposite way.
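
If endianness is a concern, a portable alternative is to extract the bytes with shifts and masks, which operate on the value rather than on its memory layout. This is a minimal sketch of my own (assuming the 16-bit int the device uses), not code from the posts above:

#include <iostream>
using namespace std;

int main()
{
    // -998 as a 16-bit value is 0xFC1A
    unsigned short n = (unsigned short)-998;
    
    // shifts and masks are defined on the value, not on memory layout,
    // so hi and lo come out the same on little- and big-endian machines
    unsigned char hi = (unsigned char)((n >> 8) & 0xFF); // 0xFC
    unsigned char lo = (unsigned char)(n & 0xFF);        // 0x1A
    
    cout << hex << "hi: " << (int)hi << "  lo: " << (int)lo << endl;
    return 0;
}

With this you can send hi and lo in whatever order the device expects, and the result doesn't depend on the PC's byte order.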
Yikes! I just discovered middle-endianness.

http://en.wikipedia.org/wiki/Endianness
Thank you, master roshi, that solved my problem. Now all I have to do is access the integer in the microcontroller by saying int *k = (int *)split; and the integer is *k.
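
For the receiving side, a shift-based reassembly avoids the pointer cast along with any byte-order or alignment surprises on the microcontroller. A minimal sketch (the two byte values below are placeholders for what actually arrives over the serial link):

#include <iostream>
using namespace std;

int main()
{
    // placeholder bytes: the high and low bytes of -998 as a 16-bit int
    unsigned char hi = 0xFC;
    unsigned char lo = 0x1A;
    
    // rebuild the 16-bit value, then go through a signed 16-bit type
    // so the sign bit is interpreted correctly
    short n = (short)(((unsigned short)hi << 8) | lo);
    
    cout << n << endl; // prints -998
    return 0;
}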