I have a question about the primitive type short in Java. I am using JDK 1.6. If I have the following:
short a = 2;
short b = 3;
short c = a + b;
the compiler refuses to compile it - it reports "cannot convert from int to short" and suggests that I cast to short, so this:
short c = (short) (a + b);
really works. But my question is: why do I need to cast? The values of a and b are within the range of short (which is -32,768 to 32,767), and so is their sum.
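As an aside, I noticed that the compound assignment form compiles without an explicit cast - as far as I can tell, += applies the narrowing cast implicitly:
short a = 2;
short c = a;
c += 3; // compiles; behaves like c = (short) (c + 3)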
I also need to cast when I want to perform the operations -, *, and / (I haven't checked any others).
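For example, each of these fails in the same way unless I cast the result back to short (a minimal sketch of what I see; the variable names are mine):
short a = 6;
short b = 3;
short diff = (short) (a - b); // subtraction: fails without the cast
short prod = (short) (a * b); // multiplication: fails without the cast
short quot = (short) (a / b); // division: fails without the cast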
If I do the same with the primitive type int, I do not need to cast aa + bb to int. The following works fine:
int aa = 2;
int bb = 3;
int cc = aa + bb;
I discovered this while designing a class in which I needed to add two variables of type short, and the compiler wanted me to cast. If I do the same with two variables of type int, I don't need a cast.
A small remark: the same thing also happens with the primitive type byte. So, this works:
byte a = 2;
byte b = 3;
byte c = (byte) (a + b);
but this does not:
byte a = 2;
byte b = 3;
byte c = a + b;
For long, float, double, and int, there is no need to cast; of the types I tried, only short and byte values need it.
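Putting it all together in one self-contained snippet (a sketch; the class name PromotionDemo and the variable names are my own):
public class PromotionDemo {
    public static void main(String[] args) {
        short s1 = 2, s2 = 3;
        byte b1 = 2, b2 = 3;
        int i1 = 2, i2 = 3;
        long l1 = 2, l2 = 3;

        // short sBad = s1 + s2; // does not compile: "cannot convert from int to short"
        short sSum = (short) (s1 + s2); // compiles with an explicit cast

        // byte bBad = b1 + b2; // does not compile either
        byte bSum = (byte) (b1 + b2); // compiles with an explicit cast

        int iSum = i1 + i2;   // no cast needed
        long lSum = l1 + l2;  // no cast needed

        System.out.println(sSum + " " + bSum + " " + iSum + " " + lSum);
    }
}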