I start the program through the following command line:
Test.exe < test.dat
with the test.dat file containing (non-printable characters shown as hex escapes):
1\x0D\x0A2\x0A3
The program output is:
Car 1: 31
Car 2: a    => the \x0D character is missing; I expected: Car 2: d
Car 3: 32
Car 4: a
Car 5: 33
Can someone tell me how to get the \x0D character into my buffer when reading standard input?
I saw the ios_base::binary mode for opening files, but how do I open standard input in binary mode?
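For context, a minimal sketch of the kind of reading loop that produces output in this form (the actual program is not shown here, so this is an assumption): it reads std::cin one character at a time and prints each byte in hex.

#include <iostream>

int main()
{
    char c;
    int i = 0;
    // Read standard input byte by byte and print each value in hex.
    while (std::cin.get(c))
        std::cout << "Car " << i++ << ": "
                  << std::hex << (static_cast<unsigned int>(c) & 0xFF)
                  << std::endl;
    return 0;
}

With stdin in the default text mode on Windows, the CRLF pair is translated to a single LF before it reaches the loop, so the 0x0D byte never appears.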
You might have misinterpreted the content of the file.
But your answer shows that you get the \x0D character while I don't. Do you also use Visual Studio 5?
The binary content of test.dat, in hex, is exactly the following:
31 0D 0A 32 0A 33
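(For reference, one way to double-check those raw bytes is to open the file with the ios_base::binary flag mentioned above and dump every byte in hex; a minimal sketch, assuming the file is named test.dat:)

#include <fstream>
#include <iostream>
#include <iomanip>

int main()
{
    // Binary mode: no CRLF -> LF translation, so 0x0D is preserved.
    std::ifstream in("test.dat", std::ios_base::in | std::ios_base::binary);
    char c;
    while (in.get(c))
        std::cout << std::setw(2) << std::setfill('0') << std::hex
                  << (static_cast<unsigned int>(c) & 0xFF) << ' ';
    std::cout << std::endl;
    return 0;
}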
The output is:
>TestUtil <..\test.dat
Car 0: 31
Car 1: a
Car 2: 32
Car 3: a
Car 4: 33
while I expected:
>TestUtil <..\test.dat
Car 0: 31
Car 1: d
Car 2: a
Car 3: 32
Car 4: a
Car 5: 33
The problem is solved by adding the following code before the first use of cin:
// Requires <io.h>, <fcntl.h>, <cstdio>, <cerrno> and <cstring>
if (_setmode(_fileno(stdin), _O_BINARY) == -1)
    cout << "ERROR: while converting cin to binary: " << strerror(errno) << endl;