Why does Float.valueOf("2.7") > 2.7 give me true?
public static void main(String[] args) {
System.out.println(Float.valueOf("2.7") > 2.7);
System.out.println(Float.valueOf("2.7") > 2.8);
System.out.println(Float.valueOf("2.7") > 2.6);
}
prints
true
false
true.
If I do Float.valueOf("2.7") > Float.valueOf("2.7"), it returns false. Can anyone help me understand this behavior?
The literal 2.7 is a double - i.e. the closest double value to 2.7. Float.valueOf("2.7") - or 2.7f, equivalently - is the closest float value to 2.7. Neither of them will be exactly 2.7 - and in this case, they're both slightly greater than 2.7. The actual values are:
float: 2.7000000476837158203125
double: 2.70000000000000017763568394002504646778106689453125
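You can check these exact values yourself: the BigDecimal(double) constructor preserves the exact binary value of its argument (widening a float to double is lossless), unlike BigDecimal.valueOf, which goes via the decimal string. A minimal sketch (class name ExactValues is mine):

```java
import java.math.BigDecimal;

public class ExactValues {
    public static void main(String[] args) {
        // Exact value of the float 2.7f (widened losslessly to double)
        System.out.println(new BigDecimal((double) 2.7f));
        // Exact value of the double 2.7
        System.out.println(new BigDecimal(2.7));
    }
}
```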
As you can see, the float value really is greater than the double value.
In cases where the closest value is lower than the "ideal" one, you'll see the reverse effect: the float value will be smaller than the double value, because the double value is closer to the "ideal" one. You'll see that with 2.8, for example, where the values are:
float: 2.7999999523162841796875
double: 2.79999999999999982236431605997495353221893310546875
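A quick sketch of both directions side by side (class name RoundingDirection is mine): 2.7f rounded up relative to 2.7, so after widening it compares greater than the double; 2.8f rounded down relative to 2.8, so it compares smaller.

```java
public class RoundingDirection {
    public static void main(String[] args) {
        // 2.7f is slightly above 2.7; the double is closer, so float wins
        System.out.println(2.7f > 2.7);  // true
        // 2.8f is slightly below 2.8; here the float loses
        System.out.println(2.8f > 2.8);  // false
    }
}
```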
If you use Double.valueOf instead of Float.valueOf - or alternatively, if you compare against a float literal - you should get the expected result:
System.out.println(Float.valueOf("2.7") > 2.7f);
System.out.println(Double.valueOf("2.7") > 2.7);
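Put together as a minimal runnable program (class name ComparisonFix is mine), both comparisons now stay within a single type, so each side rounds to the same value and the result is false:

```java
public class ComparisonFix {
    public static void main(String[] args) {
        // float vs float: both sides are the same nearest-float value
        System.out.println(Float.valueOf("2.7") > 2.7f);   // false
        // double vs double: both sides are the same nearest-double value
        System.out.println(Double.valueOf("2.7") > 2.7);   // false
    }
}
```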
See more on this question at Stackoverflow