Unexpected result using division with signed integers
The following code produces surprising results:
#include <stdio.h>

int main(void)
{
    signed int a = -9;
    unsigned int b = 3;
    signed int c = a / b;   /* mixed signed/unsigned division */
    signed short d = a / b;
    printf("c = %d, d = %hd\n", c, d);
    return 0;
}
Expected: c = -3, d = -3
Got: c = 1431655762, d = 21842
Clearly the compiler is converting a to an unsigned int before doing the division and then converting the unsigned quotient back on assignment. However, regardless of the values of a and b, the true quotient always fits in a signed int with no overflow, so there should be no problem producing the correct answer in every case.
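Working through the arithmetic confirms that interpretation. A minimal check (my own, assuming 32-bit int and unsigned int, as on Cortex-M4):

#include <stdio.h>

int main(void)
{
    unsigned int ua = (unsigned int)-9; /* 2^32 - 9 = 4294967287 */
    unsigned int q = ua / 3u;           /* 4294967287 / 3 = 1431655762 */
    printf("q = %u\n", q);              /* the value I get for c */
    /* narrowing to short keeps the low 16 bits: 0x5552 = 21842
       (implementation-defined before C23, but GCC wraps) */
    printf("(short)q = %hd\n", (short)q);
    return 0;
}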
In pseudo-code, I'd expect something like the following to happen instead:
signed int divide(signed int a, unsigned int b)
{
    /* when b exceeds |a|, the truncated quotient is 0 (abs is from <stdlib.h>) */
    if (b > (unsigned int)abs(a))
        return 0;
    /* otherwise b fits in a signed int, so divide as signed */
    return a / (signed int)b;
}
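A quick sanity check of that sketch (my own harness, with divide repeated so it compiles standalone) gives the answers I'd expect:

#include <stdio.h>
#include <stdlib.h>

/* the sketch from above, repeated so this compiles on its own */
signed int divide(signed int a, unsigned int b)
{
    if (b > (unsigned int)abs(a))
        return 0; /* |a| / b < 1, so truncation toward zero gives 0 */
    return a / (signed int)b;
}

int main(void)
{
    printf("%d\n", divide(-9, 3u));  /* -3 */
    printf("%d\n", divide(-9, 10u)); /* 0 */
    return 0;
}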
I don't know if this is a bug or a subtle trap in the C spec ready to catch the unwary.
For the record, I'm using:
arm-none-eabi-gcc -mcpu=cortex-m4 -mthumb -g ...
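Incidentally, I believe GCC's -Wsign-conversion (also enabled by -Wconversion in C) would have warned about the implicit int-to-unsigned conversion in a / b, i.e.:

arm-none-eabi-gcc -mcpu=cortex-m4 -mthumb -g -Wsign-conversion ...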