
Short Vs. Int and Number Conversion

Question:
1) What is the difference between an int and a short variable? 2) How do I convert a number from decimal to binary?

Answer:
1. Depending on the platform, some C compilers implement an int as 16 bits (the same size as a short), while others implement it as 32 bits (the same size as a long). Declaring a variable as a short gives you 16 bits on virtually every common platform; strictly speaking, the language standard only guarantees that a short is at least 16 bits and no larger than an int.

2. I see this question a lot, but as stated it doesn't quite make sense: all information is already stored in binary inside the computer, so there is nothing to "convert." If what you mean is how to take a number and format it as a string of binary digits, most compilers don't provide a standard function for this, so it takes a small routine like the following:

#include <iostream>
#include <cstdlib>

using namespace std;

int main()
{
    char buff[21];         // room for up to 20 digits plus the terminator
    itoa(33, buff, 2);     // non-standard, but supplied by many compilers
    cout << buff;          // prints 100001
    return 0;
}


