I have a byte array that was encrypted using AES, with the key derived by hashing the pass phrase with SHA-256. Everything works perfectly, but I'm wondering about the last part, where I have to encode the byte array that I get as a result of the encryption. Does it matter, for the robustness of the end result, how the byte array is encoded: Base64, conversion to hexadecimal values, something else?
Logically speaking, it shouldn't matter, since there really aren't that many encoding methods and most of the time the most obvious one, Base64, is used. But since I'm not that well versed in cryptography, I just want to make sure.
Take the following byte array as an example (random values, purely illustrative):
[0] 182
[1] 238
[2] 54
[3] 24
[4] 69
[5] 224
[6] 105
[7] 13
[8] 5
[9] 52
[10] 112
[11] 71
[12] 250
[13] 163
[14] 234
[15] 234
This gives a possible result in Base64 (random result, does not match above):
ou+yUEkilfrGIF3HBH08vg==
Using BitConverter to transform it to hexadecimal values gives (again a random result, does not match above):
A2EBCA1945E8BC920532F068D27BAEF1
It's simple to convert either of the above results back to the original byte array, and only then does the hard part start.
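For concreteness, here is a minimal C# sketch of both conversions and their inverses, using Convert.ToBase64String and BitConverter.ToString. The byte values are just the illustrative ones listed above, not real ciphertext.

using System;

class EncodingDemo
{
    static void Main()
    {
        // The illustrative 16-byte array from above.
        byte[] cipherBytes =
        {
            182, 238, 54, 24, 69, 224, 105, 13,
            5, 52, 112, 71, 250, 163, 234, 234
        };

        // Base64: a compact, reversible text representation.
        string base64 = Convert.ToBase64String(cipherBytes);

        // Hex: BitConverter inserts dashes between bytes, so strip them
        // if you want a plain hexadecimal string.
        string hex = BitConverter.ToString(cipherBytes).Replace("-", "");

        Console.WriteLine(base64);
        Console.WriteLine(hex);

        // Both representations are trivially reversible back to the original bytes.
        byte[] fromBase64 = Convert.FromBase64String(base64);

        byte[] fromHex = new byte[hex.Length / 2];
        for (int i = 0; i < fromHex.Length; i++)
        {
            fromHex[i] = Convert.ToByte(hex.Substring(i * 2, 2), 16);
        }
    }
}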
Does it matter, for the robustness of the end result, how the byte array is encoded: Base64, conversion to hexadecimal values, something else?
No, not at all. So long as you're encoding it in a lossless format (which both Base64 and hex are), that's fine. Don't use something like Encoding.ASCII.GetString(...) - that would be lossy and inappropriate. (Don't use Encoding at all for this task.)
Just ask yourself whether you could reverse your encoding and get back to the original bytes - if so, you're fine. (And that's true for hex and base64, assuming it's properly implemented.)
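To illustrate that round-trip test, here is a small C# sketch (using made-up byte values, not real ciphertext) showing that Base64 survives the round trip exactly, while going through Encoding.ASCII does not, because ASCII cannot represent byte values above 127:

using System;
using System.Linq;
using System.Text;

class RoundTripCheck
{
    static void Main()
    {
        byte[] original = { 182, 238, 54, 24, 69, 224, 105, 13 };

        // Lossless: decoding the Base64 text yields the original bytes exactly.
        string base64 = Convert.ToBase64String(original);
        byte[] roundTripped = Convert.FromBase64String(base64);
        Console.WriteLine(original.SequenceEqual(roundTripped)); // True

        // Lossy: ASCII only covers values 0-127, so high byte values are
        // replaced and the original bytes cannot be recovered.
        string ascii = Encoding.ASCII.GetString(original);
        byte[] mangled = Encoding.ASCII.GetBytes(ascii);
        Console.WriteLine(original.SequenceEqual(mangled)); // False
    }
}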