
June 20, 2018

Char Is Not Int

The code below expects that you can create a character from a number. The premise is already shaky on technical grounds: in Java, int is signed while char is unsigned, and a char is only 16 bits wide, while an int is 32. Converting an int to a char therefore requires an explicit cast, and that cast silently discards the high 16 bits of the value.
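The original code listing appears to be missing from this copy of the article. A minimal sketch of the pitfall it describes, reconstructed from the surrounding text (class and variable names are illustrative, not from the original), might look like:

```java
public class CharIsNotInt {
    public static void main(String[] args) {
        int n = 65;
        // char c = n;       // does not compile: "possible lossy conversion from int to char"
        char c = (char) n;   // explicit cast required; the int is reinterpreted as a UTF-16 code unit
        System.out.println(c);

        // The cast discards the high 16 bits, so a negative int wraps
        // around to a large unsigned char value instead of failing.
        int negative = -1;
        char wrapped = (char) negative;
        System.out.println((int) wrapped);
    }
}
```

Running this prints `A` for the first conversion, but the second shows the silent truncation: `-1` becomes `65535` (0xFFFF), because char has no sign bit and only the low 16 bits of the int survive the cast.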