Decimal to Binary

Hello there.
I'm trying to write a converter (DECimal to BINary).
The problem is that the BINary number 'cout's' incorrectly.
It has to be reversed. How do I reverse it?

#include <iostream>
using namespace std;

int main()
{
	int dec, bin;
	cout << "Write a positive decimal number: ";
	cin >> dec;
	while (dec > 0)
	{
		bin = dec % 2;   // this produces the digits least-significant-first
		dec = dec / 2;
		cout << bin;
	}
	cout << endl;
	return 0;
}
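A minimal way to fix the reversed output (one possible approach, not the only one): collect the digits in a string as they come out, then reverse the string before printing.

```cpp
#include <iostream>
#include <string>
#include <algorithm>

// Collect binary digits least-significant-first, then
// reverse them into conventional order before returning.
std::string to_binary(int dec)
{
    std::string bin;
    while (dec > 0)
    {
        bin += (dec % 2) ? '1' : '0';  // digits come out LSB-first
        dec /= 2;
    }
    std::reverse(bin.begin(), bin.end());
    return bin.empty() ? "0" : bin;
}
```

In your main you would then just write `cout << to_binary(dec) << endl;` after reading `dec`, instead of printing inside the loop.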
#include <iostream>
#include <string>
#include <limits>

std::string as_binary( unsigned num )
{
	if ( num == 0 ) return "0" ; // otherwise the loop below never ends

	// start with only the most significant bit set
	unsigned mask = 1u << ( std::numeric_limits<unsigned>::digits - 1 ) ;

	// skip leading 0s
	while ( (mask & num) == 0 )
		mask >>= 1 ;

	std::string result ;
	while ( mask )
	{
		result.push_back( (mask & num ? '1' : '0') ) ;
		mask >>= 1 ;
	}

	return result ;
}

int main()
{
	unsigned dec;
	std::cout << "Write a positive decimal number: ";
	std::cin >> dec;
	std::cout << as_binary(dec) << '\n' ;
	return 0;
}


is one way to do it.
closed account (4z0M4iN6)
The other way would be to shift num instead of the mask (then mask isn't needed as a variable) and always take the first bit. But it seems you don't want leading 0s; if you don't, you have to do some pre-shifting first.

Maybe you should think about my idea. The other way you can't really write any more, because cire has already written it.

And also a hint: you could think about 0x80000000 and <<= 1.
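The hint above about 0x80000000 and <<= 1 can be sketched like this (my own sketch of the idea, assuming a 32-bit unsigned; it keeps leading zeros, per the remark about pre-shifting):

```cpp
#include <string>

// Produce the bits of num from most significant to least
// significant by shifting num itself left and always
// inspecting its top bit. Assumes 32-bit unsigned.
std::string bits_by_shifting(unsigned num)
{
    std::string result;
    for (int i = 0; i < 32; ++i)
    {
        // 0x80000000 masks the most significant bit
        result.push_back((num & 0x80000000u) ? '1' : '0');
        num <<= 1;   // bring the next bit to the top
    }
    return result;
}
```

To drop the leading zeros you would first shift num left until its top bit is set (counting how far), then only emit the remaining bits.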
closed account (zb0S216C)
Here's my version:

int Number(2134);

// Note: this walks from bit 0 upward, so it prints the least
// significant bit first (the reverse of the usual notation).
for (std::size_t I = 0; I < sizeof(Number) * 8; ++I)
    std::cout << ((Number & (1u << I)) ? '1' : '0');

Works in both Visual C++ '10 & GCC 4.4.1

Wazzak
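For comparison, walking from the top bit down produces the conventional most-significant-first order (a variant sketch of the loop above, not Framework's code; it assumes a 32-bit unsigned):

```cpp
#include <string>

// Same idea as the bit-index loop, but iterate from the
// highest bit down so the most significant digit comes first.
std::string msb_first(unsigned Number)
{
    std::string out;
    for (int I = sizeof(Number) * 8 - 1; I >= 0; --I)
        out += (Number & (1u << I)) ? '1' : '0';
    return out;
}
```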
closed account (zb0S216C)
atrium wrote:
"it prints the digits in reverse order"

It depends on the endianness of your system.

Wazzak
 
It depends on the endianness of your system


No, it doesn't. =)
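cire is right here: shifts and masks operate on the numeric value of an integer, not on its byte layout in memory, so extracting digits with them gives the same result on little-endian and big-endian machines alike. A small illustration (my own, nothing assumed beyond standard C++):

```cpp
// Extract bit k of n by value. Byte order in memory is
// irrelevant, because >> is defined on the numeric value
// of n, not on how its bytes happen to be stored.
unsigned bit_of(unsigned n, unsigned k)
{
    return (n >> k) & 1u;
}
```

Endianness only becomes visible when you inspect an object's bytes through a `char*`, which none of the printing loops in this thread do.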
I tried this and it works quite fine; you might want to improve its efficiency.
#include <iostream>
#include <string>
#include <algorithm>

// Return '0' or '1' depending on the lowest bit of dec
char bin(int dec)
{
    return (dec % 2 == 0) ? '0' : '1';
}

using namespace std;

int main()
{
    int dec;
    cout << "Enter value in decimal: ";
    cin >> dec;

    int original = dec;     // keep a copy: dec is destroyed below
    string my_bin;          // grows as needed, no fixed SIZE

    while (dec > 0)
    {
        my_bin += bin(dec); // digits are collected LSB-first
        dec /= 2;
    }
    if (my_bin.empty())
        my_bin = "0";

    // strrev is non-standard; std::reverse does the same job portably
    reverse(my_bin.begin(), my_bin.end());
    cout << "The binary code for " << original << " is: " << my_bin << endl;
    return 0;
}
closed account (4z0M4iN6)
And I thought it was Agonche who wanted to write such a converter. Agonche, cire's example is good; the others are not, and you shouldn't look at them if you want to become a good programmer.

And the best thing would be to do it yourself, because I think you can do it better, except maybe for cire.

And if the others would like to learn something, perhaps they should look at your code after you have finished it.
closed account (zb0S216C)
dadabe wrote:
"And Agonche, the example of cire is good, the others are not. And you shouldn't look at these, if you want to become a good programmer."

...says the master C++ programmer who's been programming in C++ for less than a year. iHutch was right; you are patronising. atrium has a good attitude; one you should consider following. Code aesthetics isn't everything.

cire wrote:
"No, it doesn't. =)"

Evidence to support that remark? :)

Wazzak
closed account (4z0M4iN6)
Yes, atrium should learn from examples, but good examples, like cire's. It could be that the reason for so much ugly or clunky code in the real world is that programmers learned from the wrong examples.

And in the real world we often have code that works, but when someone looks at it, he doesn't see what it means or why it works. And if there is a bug somewhere, people don't find it and make a workaround. The workaround doesn't work, and nobody knows why, so they make a workaround for the workaround. Maybe some sleep would help.

Yes, we have a lot of such code, and it need not be that way. People think that making it less than perfect will save a few days. Maybe it does now, but it costs months or years later.

OK, nobody needs to concentrate too much when beginning the implementation. But the last two steps should be: perfect the code, and comment it. This doesn't cost any time, because it saves the time spent hunting bugs that would have been caught if somebody had made the code clear.

And about examples: it's true that we don't need to begin from scratch now that we have libraries. But some examples exist so that people can learn something from them, for example a bubble sort or a selection sort. You want to do it; others have done it. You look at their listing and think, yes, that's how it should be. A day or a few days later you can't remember the example any more. Had you thought it through yourself, without any help, it would have been your own thoughts and your own invention. If twenty years later somebody asks you about such a sort, you can simply write down those lines, because lines that are your own thoughts you can always remember.

And when you have many such ideas of your own in your head, you can always do what others have forgotten or can no longer find in books. And such ideas are not only ideas to remember; you have also learned how to create new ones.

And another reason you should make it your own: you also don't ask somebody how to post a few lines of text in this forum.

And my opinion: implementing a few lines of code is easier than writing a few lines of text. You just write it. And if you know a few principles, you can just write it.
closed account (zb0S216C)
dadabe wrote:
"Yes, atrium should learn from examples, but good examples, like cire's."

What's wrong with mine? Matri X's? Come on, Mr. C++ Master, enlighten us all with your thorough expertise.

dadabe wrote:
"It could be that the reason for so much ugly or clunky code in the real world is that programmers learned from the wrong examples."

You've just managed to generalise every programmer who writes "clunky-looking" code as a bad programmer. Oh, yeah, I forgot you're the master programmer around here. Sorry.

dadabe wrote:
"And in the real world we often have code that works, but when someone looks at it, he doesn't see what it means or why it works."

That's the whole point of comments.

P.S.: The rest of your post is rambling. And for the love of crumb cake, ease up on the commas.

Wazzak
closed account (4z0M4iN6)
atrium wrote:

What I find most interesting about programming is not so much what is good or bad practice, but the mere fact that there are hundreds of different ways to write code to achieve the same result. It can be informative to consider other possibilities.


Yes, much is interesting, and being a collector is a nice hobby. Someone may have 500 hammers. Which one should he take to hammer a nail into the wall? He can't decide, so he reads books about it and visits forums. A craftsman would maybe have only one, the best, and would know what to do!
@dadabe
You seem to have a lot to say about the way others code, yet the only code I've seen you post was that BS copy/move ctor test. Then, when asked to post the code you brought up, you gave a BS reason why you can't. Show some code or STFU.
closed account (4z0M4iN6)
atrium wrote:

Particularly if code has to be re-implemented on a different platform or in a different language, different approaches can be very useful.


This, I think, is a very good idea. And I think Agonche could also find it very interesting.

atrium wrote:

start again from scratch, in the real world, that won't always be possible


You are right about this. And we also wouldn't want to reinvent the wheel; we can simply use what our compiler and the libraries offer.

But we should also know what simple expressions mean on different systems, something people normally don't think about, because they simply use what they have without questioning it. And what is clumsy or not depends very much on the system we use. For example:

a << 1

That is never clumsy; every microprocessor can do this.

But what about:

a << n

Microprocessors like the ones in our PCs can do this. Maybe << 8 needs more time than << 1. This difference could matter if we need the highest speed we can get.

But what about a << n if the microprocessor doesn't understand a << n, but only a << 1?

Then we should think of a function ShiftLeft:

unsigned int ShiftLeft( unsigned int a, unsigned char n )
{
     // emulate a << n using only single-bit shifts
     for ( int i = 0; i < n; ++i )  a <<= 1;
     return a;
}

This code would not perform very well.

And what if we only had a system with 8-bit data?

First, let's try a function for just a << 1:

unsigned int ShiftLeft1( unsigned int a )
{
     unsigned char * input = reinterpret_cast<unsigned char *>( &a );

     unsigned int ReturnValue;
     unsigned char * output = reinterpret_cast<unsigned char *>( &ReturnValue );

     // ========= Your Code ============
     // you may use: << 1 , unsigned char
     // you may not use << n, or data types other than unsigned char
     // you may not use / (division), * (multiplication), % (modulo)


     // ================================

     return ReturnValue;
}
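For what it's worth, one possible fill-in for the skeleton might look like this (my own sketch; the exercise deliberately leaves the body open, so skip this if you want to solve it yourself). It assumes a little-endian machine, where byte 0 is the least significant; on a big-endian system the loop would run in the opposite direction.

```cpp
// A possible solution sketch for the ShiftLeft1 exercise:
// shift a 32-bit value left by one using only unsigned char
// operations, carrying each byte's top bit into the next byte.
// Assumes little-endian byte order (as on typical PCs).
unsigned int ShiftLeft1(unsigned int a)
{
    unsigned char *input = reinterpret_cast<unsigned char *>(&a);

    unsigned int ReturnValue = 0;
    unsigned char *output = reinterpret_cast<unsigned char *>(&ReturnValue);

    unsigned char carry = 0;
    for (unsigned char i = 0; i < sizeof(unsigned int); ++i)
    {
        unsigned char byte = input[i];
        output[i] = static_cast<unsigned char>((byte << 1) | carry);
        carry = (byte & 0x80u) ? 1 : 0;   // top bit moves into the next byte
    }
    return ReturnValue;
}
```

The n-bit version below can then simply call this n times, just like the loop version earlier in the post.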


Then we try to implement the following function:


unsigned int ShiftLeft( unsigned int a, unsigned char n )
{
     unsigned char * input = reinterpret_cast<unsigned char *>( &a );

     unsigned int ReturnValue;
     unsigned char * output = reinterpret_cast<unsigned char *>( &ReturnValue );

     // ========= Your Code ============
     // you may use: << 1 , unsigned char
     // you may not use << n, or data types other than unsigned char
     // you may not use / (division), * (multiplication), % (modulo)


     // ================================

     return ReturnValue;
}


And because I think of this difference, I always use a << 1 when I don't need a << n.

And why such functions? We don't need them, because the compiler translates this for us, but we should also think about what the compiler does.

This topic is a very interesting one, but most programmers seem to know nearly nothing about it.

And you especially need knowledge of bits and bytes for programming close to the hardware.
closed account (zb0S216C)
dadabe wrote:
"And we also wouldn't want to reinvent the wheel; we can simply use what our compiler and the libraries offer."

The STL containers are optimised for general, all-round use. Sometimes you have to reinvent the wheel to meet particular requirements.

dadabe wrote:
"but we should also think about what the compiler does."

And when it disagrees with you, it's marked as a buggy compiler, and then you go off with your so-called investigations, which are completely false. Consider learning the C++ language and reading the compiler's manual before challenging the language, the standard, and the compiler.

dadabe wrote:
"But most programmers seem to know nearly nothing about it."

...and neither do you, by the looks of it.

I asked a question regarding why some examples were incorrect; you discarded the question and began to ramble on about microprocessors. I can say that your explanation of bit-shifting is far from thorough. Not to mention the examples are terrible.

Wazzak