Char Is Not Int

June 20, 2018

The code below expects that you can create a character from a number. That is already technically shaky: int is signed, while char is unsigned (a char is just 16 bits, while an int is 32).
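To make the mismatch concrete, here is a minimal, self-contained Java sketch (the class and variable names are mine, not from the snippet this post discusses): going from int to char requires an explicit narrowing cast, and a negative int silently wraps into the unsigned 16-bit range.

```java
// Illustrative only: how an int becomes a char in Java,
// and why the signed/unsigned mismatch matters.
public class CharFromInt {
    public static void main(String[] args) {
        int code = 65;
        char c = (char) code;               // explicit narrowing cast: 32-bit signed int -> 16-bit unsigned char
        System.out.println(c);              // prints: A

        int negative = -1;
        char wrapped = (char) negative;     // the sign bit is discarded; the value wraps into 0..65535
        System.out.println((int) wrapped);  // prints: 65535
    }
}
```

The second cast is the point: treating a char as "just a small int" loses information, because the sign disappears and the value is reinterpreted modulo 2^16.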