Consider the following BitArray:
BitArray bitArray = new BitArray(new Boolean[] {true, true, false, false});
which in binary equals:
1100
Now I want to convert this to an int and have tried to use the methods described on this page: How can I convert BitArray to single int?
However, both of these methods convert 1100 to 3 instead of 12. So it seems as if they ignore the last two bits and treat the value as only 2 bits wide, for which the answer would of course be 3.
One of the methods on the linked page above, in action:
int[] array = new int[1];
bitArray.CopyTo(array, 0); // packs the bits into array[0]
After executing the above, array[0] has the value 3.
How can I express in the code that I want it to consider all 4 bits?
The constructor for BitArray(bool[]) accepts the values in index order, and CopyTo then uses them with the traditional significance (so bitArray[0] is the least significant bit). Your true, true, false, false therefore ends up meaning 0011 in binary, not 1100.
It's not ignoring the last two bits - it's just treating your initial array in the opposite order to the one you expected.
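To see that mapping directly, here is a minimal sketch (the class name BitOrderDemo and the variable names are mine, not from the question):

using System;
using System.Collections;

class BitOrderDemo
{
    static void Main()
    {
        var bits = new BitArray(new bool[] { true, true, false, false });

        // Index order: bits[0] and bits[1] are true.
        Console.WriteLine(bits[0]); // True  -> least significant bit
        Console.WriteLine(bits[3]); // False -> most significant bit

        // CopyTo packs bits[0] into bit 0 of the int, bits[1] into bit 1, etc.
        int[] result = new int[1];
        bits.CopyTo(result, 0);
        Console.WriteLine(result[0]); // prints 3, i.e. binary 0011
    }
}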
If you want the first-specified value to be treated as the most significant bit when converting the bits to an integer, you'll need to reverse your input array first, as in the sketch below.
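For example (again a sketch; note that Array.Reverse mutates the array in place, so reverse it before constructing the BitArray):

using System;
using System.Collections;

class ReverseDemo
{
    static void Main()
    {
        bool[] input = { true, true, false, false };

        // Reverse so the first-specified value becomes the most significant bit.
        Array.Reverse(input);

        var bits = new BitArray(input);
        int[] result = new int[1];
        bits.CopyTo(result, 0);
        Console.WriteLine(result[0]); // prints 12, i.e. binary 1100
    }
}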