Java : jre1.8.0_45
I finished debugging my Elliptic Curve class, and while logging all the characteristics of the keys I also log the BIT length of the keys (which for Elliptic Curve is not always an even number of bits).
I display the bit length of the Keys via a BigInteger:
ECPrivateKey oPK = ...; // generate the key
BigInteger oBI = oPK.getS();
MetaLogBook.debug("Key Size in Bits : " + oBI.bitLength() +
        "\nRaw Key Hex : 0x" + oBI.toString(16).toUpperCase() + "\n");
While the key is always correctly represented and ciphering works fine, the bit size fluctuates: many times it is correct (571), but from time to time it is a few bits off.
So I started generating AES-256 keys and noticed that they are often 256 bits, but also from time to time a few bits off. So it had nothing to do with the odd number of bits in Elliptic Curve.
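For reference, a minimal sketch of the kind of test that reproduces the fluctuation (the class name, loop and KeyGenerator setup here are only illustrative, not my actual code):

import javax.crypto.KeyGenerator;
import java.math.BigInteger;

public class BitLengthFluctuation {
    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        for (int i = 0; i < 10; i++) {
            byte[] raw = kg.generateKey().getEncoded(); // always 32 bytes
            BigInteger bi = new BigInteger(1, raw);     // signum 1 = positive value
            // Prints 256 when the top bit of the first byte is set,
            // 255, 254, ... when there are leading zero bits
            System.out.println(bi.bitLength());
        }
    }
}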
After a long search I THINK I found an explanation, or at least the beginning of an explanation, but I don't know if this is INTENDED behaviour or a Java bug.
When I saw that this 64-digit hex value (a 32-byte AES-256 key) had a bit length of 254 displayed instead of 256, I could draw a conclusion.
0x27006F59EA138FE01FBE1F554253DBDD84D73719E77088907357C6FA6B60F170
The last nibble is 0, so if trailing zero bits were not counted I would have been at least 4 bits short in the bitLength(), and I was only 2 bits short.
Then it occurred to me that the first nibble of my key was a 2, binary 0010. So I figured that BigInteger.bitLength() doesn't count the LEADING zero BITS. I repeated this a number of times and the behaviour seems consistent (I think anyone can reproduce this).
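The leading-zero theory can be checked directly on the value above (a small self-contained check, separate from my logging code; the class name is just for illustration):

import java.math.BigInteger;

public class LeadingZeroBitsCheck {
    public static void main(String[] args) {
        // 64 hex digits = nominally 256 bits; the first nibble is 2 (binary 0010),
        // so the two most significant bits are zero.
        BigInteger oBI = new BigInteger(
            "27006F59EA138FE01FBE1F554253DBDD84D73719E77088907357C6FA6B60F170", 16);
        System.out.println(oBI.bitLength()); // prints 254, not 256
    }
}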
I would like to know if this is the WANTED behaviour of BigInteger.bitLength() or if it could possibly be a bug. I wouldn't have asked the question if it weren't that I assume a lot of crypto code in Java (including providers) relies on BigInteger, and I cannot imagine it wouldn't have run into this problem.
TIA
It's working as intended and as documented.
From the documentation:
Returns the number of bits in the minimal two's-complement representation of this BigInteger, excluding a sign bit. For positive BigIntegers, this is equivalent to the number of bits in the ordinary binary representation.
Note the "minimal" part here - for example the decimal value 5 can be represented as 00000000000000000000000000101 or 101... but 101 is the minimal representation, so the bit length is 3.