Short Vs. Int and Number Conversion

1) What is the difference between an int and a short variable? 2) How do I convert a number from decimal to binary?

1. Depending on the platform, some C compilers implement an int as 16 bits while others implement it as 32 bits. The language only guarantees that a short is at least 16 bits and that an int is at least as wide as a short; in practice, a short is 16 bits on nearly every platform, so declaring a variable as a short is the usual way to get a 16-bit integer regardless of whether the compiler's int is 16 or 32 bits.

2. I see this question a lot. As stated, though, it doesn't quite make sense: all information is already stored in the computer in binary, so there is nothing to convert. If what you mean is how to take a number and produce a string of its binary digits, most compilers don't provide a standard function for this, but many supply the non-standard itoa(), which takes a radix argument. A small routine like this does the job:

#include <stdlib.h>    // itoa (non-standard, but common on DOS/Windows compilers)
#include <iostream>

using namespace std;

int main()
{
    char buff[21];        // room for up to 20 digits plus the terminator
    itoa(33, buff, 2);    // radix 2 gives the binary digits of 33
    cout << buff;         // prints "100001"
    return 0;
}
