I’m using the sizeof() operator to find the length of my integers just like the Q&A section said, but it consistently returns a 4 regardless of integer length. What could be causing an error like this?
I’m more puzzled trying to determine exactly what you were expecting sizeof() to do.
sizeof is an operator, not a function; it yields the number of bytes a variable, or a type, occupies in storage. It reports the size of the type, not anything about the value stored in it, so an int holding 7 and an int holding 1234567 are the same size. Your particular compiler apparently implements integers as 4 bytes each.
This is normal behavior.