Are there other encoding methods besides base64 that end in "="?

I've inherited a project where the previous developer used an ASP object called ActiveCrypt.Crypt to encrypt the user's password before sending it to the database.
The call uses the encryptvariant() function with a mode of 7, which the only documentation I can find indicates is 3DES (the company is now defunct). The problem is that the value the function produces appears to be a base64-encoded string (the trailing "=" and "==" are a dead giveaway).
Are there any other encodings that frequently end in "=" or "=="? Is anyone familiar with this ActiveCrypt object? I've tried 3DES-encrypting the password with the key and then converting to base64, but with no luck. I've also tried swapping the key and the password in case the developer swapped the arguments. Any help would be appreciated.
Some examples using the key "key" (without quotes):
abcdefg: xiupz3RT148=
123456: iDLXPSPPjd4=
test: AWulSF10FR0=
1234567890: 8I48MAg9YWvE3y52VfMYew==

The values you show look like 8 and 16 bytes encoded with normal base64. Base64 encodes 3 bytes as 4 characters. DES and 3DES operate on a block size of 8 bytes, so the lengths of the base64 text reflect the block size. Furthermore, the output of the base64 decoding looks fully random.
So after base64 decoding you will have 8 or 16 bytes, which you then have to decrypt. The key is of course unknown to us, as are the block mode of operation and the padding mode, so you will have to find those out yourself. If the key is not given, it could be hard-coded within the application.
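For instance, a quick check (Python; the decoding itself involves no key) confirms those lengths:

import base64

samples = {
    "abcdefg": "xiupz3RT148=",
    "123456": "iDLXPSPPjd4=",
    "test": "AWulSF10FR0=",
    "1234567890": "8I48MAg9YWvE3y52VfMYew==",
}
for plaintext, encoded in samples.items():
    raw = base64.b64decode(encoded)
    print(plaintext, len(raw))  # 8, 8, 8, 16 -- one or two 8-byte blocks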
Happy hunting.

Related

AES 128 decryption with ciphertext shorter than key

We are developing an application that has to work with data that is encrypted by LoraWan (https://www.lora-alliance.org).
We have already found the documentation of how they encrypt their data and have been reading through it for the past few days (https://www.lora-alliance.org/sites/default/files/2018-04/lorawantm_specification_-v1.1.pdf), but we still can't solve our problem.
We have to use AES 128-bit ECB decryption with zero-padding to decrypt the messages. The problem is that it's not working: the encrypted messages we receive are not long enough for AES-128, so the algorithm throws a "Data is not a complete block" exception on the last line.
An example key we receive looks like this: D6740C0B8417FF1295D878B130784BC5 (not a real key). It is 32 characters long, so 32 bytes, but if we treat it as hexadecimal it becomes 16 bytes long, which is what AES-128 needs. This is the code we use to convert the hex string:
public static string HextoString(string InputText)
{
    byte[] hex = Enumerable.Range(0, InputText.Length)
                           .Where(x => x % 2 == 0)
                           .Select(x => Convert.ToByte(InputText.Substring(x, 2), 16))
                           .ToArray();
    return System.Text.Encoding.ASCII.GetString(hex);
}
(A small thing to note about the above code is that we are not sure which Encoding to use, as we could not find it in the Lora documentation and they have not told us. Depending on this small setting we could be messing up our decryption, though we have tried all possible combinations: ASCII, UTF-8, UTF-7, etc.)
An example message we receive is: d3 73 4c, which we are assuming is also hexadecimal. That is only 6 characters, so 3 bytes once converted from hex, compared to the 16 bytes we'd need at minimum to match the key length.
This is the AES-128 decryption code we are using:
private static string Aes128Decrypt(string cipherText, string key)
{
    string decrypted = null;
    var cipherPlainTextBytes = HexStringToByteArray(cipherText);
    //var cipherPlainTextBytes = ForcedZeroPadding(HexStringToByteArray(cipherText));
    var keyBytes = HexStringToByteArray(key);
    using (var aes = new AesCryptoServiceProvider())
    {
        aes.KeySize = 128;
        aes.Key = keyBytes;
        aes.Mode = CipherMode.ECB;
        aes.Padding = PaddingMode.Zeros;
        ICryptoTransform decryptor = aes.CreateDecryptor(aes.Key, aes.IV);
        using (MemoryStream ms = new MemoryStream(cipherPlainTextBytes, 0, cipherPlainTextBytes.Length))
        using (CryptoStream cs = new CryptoStream(ms, decryptor, CryptoStreamMode.Read))
        using (StreamReader sr = new StreamReader(cs))
        {
            decrypted = sr.ReadToEnd();
        }
    }
    return decrypted;
}
So obviously this is going to throw "Data is an incomplete block" at sr.ReadToEnd().
As you can see from the commented-out line, we have also tried to "pad" the text to the correct size with a zero byte array of the correct length (16 - cipherText), in which case the algorithm runs fine, but it returns complete gibberish and not the original text.
We have already tried all of the modes of operation and messed around with the padding modes as well. They are not providing us with anything but a ciphertext and a key for that text. No initialization vector either, so we assume we are supposed to generate that every time (but for ECB it isn't even needed, IIRC).
What's more, they are able to encrypt and decrypt their messages just fine. What is most puzzling is that I have been googling this for days now and I cannot find a single example where the CIPHERTEXT is shorter than the key during decryption.
Obviously I have found examples where the message being encrypted is shorter than what is needed, but that is what padding is for on the ENCRYPTION side (right?), so that when you receive the padded message you can tell the algorithm which padding mode was used to pad it to the correct length, and it can then separate the padding from the message. But in all of those cases the received message is of the correct length at decryption time.
So the question is: what are we doing wrong? Is there some way to decrypt ciphertexts that are shorter than the key? Or are they messing up somewhere by producing ciphers that are too short?
Thanks for any help.
In AES-ECB, the only valid ciphertext shorter than 16 bytes is empty. That 16-byte limit is the block (not key) size of AES, which happens to match the key size for AES-128.
Therefore, the question's
An example message we receive is: d3 73 4c
does not show an ECB-encrypted message (a comment tells us it comes from JSON, so it can't be raw bytes that merely display as hex). And it is way too short to be a FRMPayload (per this comment) for a Join-Accept, since the spec says of the latter:
1625 The message is either 16 or 32 bytes long.
Could it be that whatever that JSON message contains is not a full FRMPayload, but a fragment of a packet, encoded as hexadecimal pairs with a space separator? As long as it is not figured out how to build a FRMPayload, there's no point in trying to decipher it.
Update: If that mystery message is always 3 bytes, and if it is always the same for a given key (or available a single time per key), then per Maarten Bodewes's comment it might be a Key Check Value. The KCV is often the first 3 bytes of the encryption of the all-zero block with the key, using the raw block cipher (equivalently: ECB). Herbert Hanewinkel's JavaScript AES can work fully offline (which is necessary so as not to expose the key) and can be used to manually validate the hypothesis. It tells us that for the 16-byte key given in the question, the KCV would be cd15e1 (or c076fc per the variant in the next section).
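A minimal sketch of that KCV computation (Python with pycryptodome; the library choice is my assumption, the key is the one from the question):

from Crypto.Cipher import AES

key = bytes.fromhex("D6740C0B8417FF1295D878B130784BC5")
# KCV: first 3 bytes of the raw AES-ECB encryption of an all-zero block
kcv = AES.new(key, AES.MODE_ECB).encrypt(bytes(16))[:3]
print(kcv.hex())  # expected: cd15e1, per the offline check above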
Also: CreateDecryptor is used to craft a gizmo in charge of ECB decryption. That's likely incorrect in the context of decrypting a LoraWan payload, which requires ECB encryption for the decryption of some fields:
1626 Note: AES decrypt operation in ECB mode is used to encrypt the join-accept message so that the end-device can use an AES encrypt operation to decrypt the message. This way an end device only has to implement AES encrypt but not AES decrypt.
In the context of decrypting LoraWan packets, you want to communicate with the AES engine using byte arrays, not strings. Strings have an encoding; LoraWan ciphertext and the corresponding plaintext do not. Others seem to have managed to coerce the nice .NET do-it-all crypto API into getting this low-level job done.
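A byte-array-only sketch of that join-accept rule (Python with pycryptodome for brevity; the function name is illustrative): note that the payload is "decrypted" with the AES encrypt primitive, as the spec quote above says.

from Crypto.Cipher import AES

def join_accept_plaintext(key_hex, payload_hex):
    key = bytes.fromhex(key_hex)          # 16 bytes for AES-128
    payload = bytes.fromhex(payload_hex)  # must be 16 or 32 bytes
    # Per the spec, apply AES *encryption* to recover the plaintext
    return AES.new(key, AES.MODE_ECB).encrypt(payload)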
In the HextoString code, I gather that the intention, and perhaps the outcome, is that hex becomes the original hex input as a byte array (fully rid of hexadecimal and other encoding sins; in which case the variable hex should be renamed to something along the lines of pure_bytes). But then I'm at a loss about System.Text.Encoding.ASCII.GetString(hex). I'd be surprised if it just created a byte string from a byte array, or turned the key back into hexadecimal for later feeding to HexStringToByteArray in Aes128Decrypt. Worse, it makes me fear that any byte in [0x80..0xFF] might turn into 0x3F, which is not nice for a key, a ciphertext, or the corresponding LoraWan payload: these have no character encoding once de-hexified.
My conclusion is that if HexStringToByteArray does what its name suggests, and given the current interface of Aes128Decrypt, HextoString should simply remove whitespace (or is unneeded if HexStringToByteArray removes whitespace on the fly). But my recommendation is to change the interface to use byte arrays, not strings (see the previous section).
As an aside: an ICryptoTransform object is supposed to be created from its key once, then used multiple times.

What type of encryption do I need?

OK, the original task is to track users across two "friendly" web sites that are able to share users' cookies (let's say I have example.com and my friend has mysite.com, and he also has the domain simple.example.com, so he can set cookies on .example.com).
To track user activity we want to set a unique cookie; this cookie should be unique and 32 bytes long (ASCII). Quite simple from this point of view, and it could be implemented as:
md5(microtime)
and that's it, but now we have new constraints:
we should be able to tell who exactly set the cookie: the example.com engine or the mysite.com engine
32 bytes long is a must, still
we should be able to encrypt the timestamp (when the cookie was issued)
the first and last characters of the resulting cookie value should vary, so we can do A/B testing based on the cookie (so we could always say: if the last character of the cookie is > 'K', show this user "feature A")
Given that the resulting string should always be 32 or fewer characters long, the data should be encrypted and decrypted (not by users, of course), and the string should be unique per user, the task is quite complex.
My thoughts and questions:
we should use symmetric-key encryption (solves constraints 1 and 3), but in this case how do we ensure that the resulting string is no longer than 32 chars (constraint 2)?
is there another solution to the problem, given that the amount of data we need to encrypt is: timestamp and microseconds (14 bytes) + site-issuer flag (1 byte) = 15 bytes total?
My first take was to pack the data into a binary string and then base64-encode it. The result would be an 8-char base64-encoded string:
def encode():
    base64( pack('Lv', timestamp, microseconds) )
Add the site-issuer flag and random chars at the beginning and the end:
def getCookie():
    rand('a'...'Z') + encode() + issuerFlagChar() + rand('a'...'Z')
So, the result is 11 chars long and we meet constraint 2 easily.
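A quick length check of that scheme (Python sketch; the struct format codes are my assumption, chosen to match the 4-byte timestamp + 2-byte microseconds packing above):

import base64, struct

packed = struct.pack('<IH', 1356994800, 123)  # 4 + 2 = 6 bytes
print(len(base64.b64encode(packed)))          # 8 chars, with no '=' padding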
But the problem is: this algorithm is surely not secure, and I'm not sure the resulting string is unique across millions of website users.
I wonder if I could use DES or AES for this purpose, but I'm not sure that the resulting string will always meet constraint 2 (the resulting string should be no longer than 32 ASCII chars).
Are there symmetric-key algorithms that ensure something like "if you encrypt N bytes with an M-byte key you will have a resulting data length of Math.Ceil(N*2+1/M) bytes", so the resulting length would be predictable?
Setting aside the fact that you should indeed consult a security consultant, the actual question you pose can easily be answered:
Are there symmetric-key algorithms that ensure something like "if you encrypt N bytes with an M-byte key you will have a resulting data length of Math.Ceil(N*2+1/M) bytes", so the resulting length would be predictable?
Yes there are. And they are called Block Ciphers.
By definition, every block cipher has the property that the length of the ciphertext is equal to the length of the plaintext. In practice most block ciphers (including DES and AES) cheat a bit, because they require the plaintext to be padded to a multiple of the block length before they start encrypting.
In other words, given a plaintext of N bytes and a block size of B, the ciphertext will have a length of B*(Math.ceil(N/B)) bytes.
Note how I am talking about the block size, which is different from the key size. The key size is actually irrelevant in this case.
For example, AES uses a block size of 128 bits, or 16 bytes. This means that if your plain text is between 17 and 32 bytes long, AES will guarantee that your ciphertext is 32 bytes long. This is independent from the key size you choose, which can be one of 128, 192 or 256 bits (16, 24 or 32 bytes).
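A quick sanity check of that formula (Python, no crypto library needed):

import math

def ciphertext_len(n_bytes, block=16):
    # Length after padding up to the block size, per the formula above
    return block * math.ceil(n_bytes / block)

print(ciphertext_len(15))  # 16
print(ciphertext_len(17))  # 32
print(ciphertext_len(32))  # 32

(One caveat worth knowing: with PKCS#5/PKCS#7 padding, a plaintext that is already a multiple of B gains a full extra block, so 16 bytes would encrypt to 32.)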
First of all, you need to know whether you want to encrypt or sign the data.
Encrypting will prevent users from seeing the data, but they are still able to modify it in some ways, depending on the encryption type. For example, decrypting a modified ciphertext will simply give corrupted data; it won't fail.
Signing, on the other hand, will prevent users from modifying the data, that is, your code will be able to detect the data has been modified. A simple algorithm for this is HMAC.
I'll assume you want both. My solution below does both.
Your cookie must be 32 bytes long, which is 256 bits. We are going to use 128 bits for encrypted data and 128 bits for the HMAC.
For the data, I will encode the timestamp as a 64bit integer (more than enough even if you want to store it to microsecond precision). The site that issued the cookie can be stored as 1 bit if you have two sites, but I'll store it in a 32bit integer because we have plenty of space. Same for a tag you can use for a/b testing.
All the data is exactly 128 bits, 16 bytes. This is the exact size of an AES block. So, we will encrypt it with AES!
The other 16 bytes will be a MAC of the ciphertext (Encrypt-then-MAC). I used HMAC-SHA256, which has 256 bits of output. We only have room for 128 bits, so I have truncated it. In theory this makes it less secure, but in practice 128 bits is big enough to make a brute-force attempt infeasible.
Decrypting the cookie is the reverse process: calculate the HMAC of the given ciphertext and check that it matches the given MAC. If so, proceed to decrypt the ciphertext and unpack the data.
Here's the code:
from struct import pack, unpack
from Crypto.Cipher import AES
import hashlib
import hmac

AES_KEY = hashlib.sha256(b"secret key 1 asdfasdf").digest()
HMAC_KEY = hashlib.sha256(b"secret key 2 asdfasdf").digest()

# timestamp: 64-bit unix timestamp
# site: 32-bit integer, which site issued the cookie
# tag: 32-bit integer, tag used for a/b testing.
def encrypt_cookie(timestamp, site, tag):
    # Pack the data into exactly 16 bytes (one AES block)
    data = pack('QII', timestamp, site, tag)
    # Encrypt it (a single block; ECB takes no IV)
    aes = AES.new(AES_KEY, AES.MODE_ECB)
    ciphertext = aes.encrypt(data)
    # Do HMAC of the ciphertext (Encrypt-then-MAC)
    sig = hmac.new(HMAC_KEY, ciphertext, hashlib.sha256).digest()
    sig = sig[:16]  # Truncate to only the first 16 bytes.
    return ciphertext + sig

def decrypt_cookie(cookie):
    # Do HMAC of the ciphertext
    sig = hmac.new(HMAC_KEY, cookie[:16], hashlib.sha256).digest()
    sig = sig[:16]  # Truncate to only the first 16 bytes.
    # Check the HMAC is ok (compare_digest avoids timing leaks)
    if not hmac.compare_digest(sig, cookie[16:]):
        raise Exception("Cookie has been tampered with")
    # Decrypt it
    aes = AES.new(AES_KEY, AES.MODE_ECB)
    data = aes.decrypt(cookie[:16])
    # Unpack the data
    timestamp, site, tag = unpack('QII', data)
    return timestamp, site, tag

cookie = encrypt_cookie(1, 2, 3)
print(len(cookie))             # prints: 32
print(decrypt_cookie(cookie))  # prints: (1, 2, 3)

# Change a single byte in the cookie, the last one
cookie = cookie[:31] + b'0'
print(decrypt_cookie(cookie))  # raises the exception
I'm curious to know why the cookie must be 32 bytes, though. It seems a weird requirement, and without it you could use one of the many libraries designed to solve exactly this problem, such as Django signing if you're using Django.

Encrypt for SagePay forms using ColdFusion

I am trying to follow a specification for an encrypted field in SagePay 3.00 using ColdFusion 10.
The requirement is to encrypt the string as AES (128-bit block size) in CBC mode with PKCS#5 padding, using the provided password as both the key and initialisation vector, and to encode the result in hex.
It's the "using the provided password" that is causing the problem.
At the moment I have
myStr = 'assortednamevaluepairsetc';
providedPassword = 'abcdefghijklmnop';
myCrypt = Encrypt(myStr,providedPassword,'AES/CBC/PKCS5Padding','HEX',providedPassword,1);
but that won't work, because the value I have been given by SagePay causes an error, "key specified is not a valid key for this encryption: Invalid AES key length", as it's only 16 characters long.
According to the CF docs you need to use generateSecretKey to guarantee the key length for AES, so I've tried this, but although it gives a result, it's not the right result in terms of the encryption:
myStr = 'assortednamevaluepairsetc';
providedPassword = 'abcdefghijklmnop';
mySecret = GenerateSecretKey('AES');
myCrypt = Encrypt(myStr,mySecret,'AES/CBC/PKCS5Padding','HEX',providedPassword,1);
Any help on this gratefully received.
use generateSecretKey to guarantee the key length for AES
That function is only used when you need to generate a completely new encryption key. You already have one. The primary purpose of generateSecretKey is to ensure you generate a strong encryption key, that is sufficiently random.
won't work, because the value I have been given by SagePay causes an error, "key specified is not a valid key for this encryption: Invalid AES key length", as it's only 16 characters long
A key length of 16 bytes (i.e. 128 bits) is acceptable for AES. The problem is that encrypt() expects the "key" to be a base64-encoded string, which is about thirty-three percent longer than the plain string. When you invoke encrypt(...), CF decodes the provided "key" string into bytes, essentially doing this:
<cfset keyBytes = binaryDecode(yourPassword, "base64")>
<cfoutput>length = #arrayLen(keyBytes)# bytes</cfoutput>
Since your password string is not base64 encoded, the resulting key length is too small: twelve (12) bytes instead of sixteen (16). Hence the error message.
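You can see the same arithmetic in a quick check (Python here, just for illustration; the question's 16-character password happens to be valid base64):

import base64

print(len(base64.b64decode("abcdefghijklmnop")))  # 12 bytes, not 16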
The solution is to base64-encode it first. How you do that depends on the encoding of the string. It sounds like it is just a plain text string (hopefully a sufficiently random one...). If so, use charsetDecode to decode the string from the relevant charset (i.e. utf-8, etc.), then binaryEncode it to base64:
<cfset keyIVBytes = charsetDecode(yourKeyString, "utf-8")>
<cfset base64Key = binaryEncode(keyIVBytes, "base64")>
Also, the iv parameter should be binary. Since the key and iv are one and the same, simply use the byte array from the previous step. Also, drop the iterations parameter, as it does not apply. With those changes it should work as expected:
encrypt(myStr, base64Key,"AES/CBC/PKCS5Padding", "hex", keyIVBytes)
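For comparison, the same operation sketched outside ColdFusion (Python with pycryptodome, my assumption; PKCS#7 is the same scheme that Java/CF call PKCS5Padding for AES), using the password bytes as both key and IV, with hex output:

from Crypto.Cipher import AES
from Crypto.Util.Padding import pad

password = b"abcdefghijklmnop"           # the 16-byte value from the question
plaintext = b"assortednamevaluepairsetc"
cipher = AES.new(password, AES.MODE_CBC, iv=password)
print(cipher.encrypt(pad(plaintext, AES.block_size)).hex().upper())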
NB: I am not an encryption expert, but... using the key as an IV is NOT a great idea. You might want to check with them to see if there are other options.

'=' symbol at the end shows encryption

I am using 128-bit Rijndael in ECB cipher mode to encrypt my strings. The output is a string with "==" or "=" symbols at the end.
In my code I need to make a preliminary decision about whether a string was encrypted. Can I assume that if it ends with an "=" symbol it is encrypted, or are there cases where an encrypted string will not end with "="?
First of all, if you are using ECB you are not really "encrypting", since it is a broken cipher mode (see: http://bobnalice.wordpress.com/2009/01/28/friends-don%E2%80%99t-let-friends-use-ecb-mode-encryption/).
Secondly, the = signs you are seeing are Base64 padding characters. They are only tangentially related to encryption, since Base64 is used for any kind of binary data, not just encrypted data.
Thirdly, you can't even rely on the = sign always being present in Base64 data... it is only added for certain lengths of data (i.e. you could have encrypted, Base64 data that has no = sign).
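A quick illustration of that third point (Python): the number of trailing = signs depends only on the byte length modulo 3, not on encryption.

import base64

for n in (15, 16, 17):
    print(n, base64.b64encode(bytes(n)))
# 15 bytes -> no '=', 16 bytes -> '==', 17 bytes -> '='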
As @JoelFan points out, those '=' characters are Base64 padding, which isn't part of the encryption at all and isn't always there either, if the data comes out to the right number of characters naturally.
You need to add something "out of band". I would replace the potentially-encrypted string with a class that has a string member for the data and an "encrypted" flag. Once you're there, throw in a getData() method that checks whether the instance's data is encrypted, returns the decrypted value if so (possibly caching the plaintext in a private member for later, if it'll be accessed a lot), or just returns the plain text if it's not encrypted.
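A sketch of that wrapper idea (Python; the class name and the decrypt helper are hypothetical, not from the original code):

class MaybeEncrypted:
    def __init__(self, data, encrypted):
        self.data = data            # the stored string, possibly ciphertext
        self.encrypted = encrypted  # the out-of-band flag
        self._plain = None          # cached plaintext

    def get_data(self):
        if not self.encrypted:
            return self.data
        if self._plain is None:
            self._plain = decrypt(self.data)  # hypothetical decryption helper
        return self._plain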

invalid pixel in Firefox because of content charset setting in Netty server

I am developing an HTTP server with Netty. On some occasions, the server must answer with a 1x1 transparent pixel. So I hard-coded a GIF transparent pixel in base64 and returned it with the following code:
String pixel_string= new String (Base64.decodeBase64("R0lGODlhAQABAAAAACH5BAEKAAEALAAAAAABAAEAAAICTAEAOw=="));
HttpResponse response = new DefaultHttpResponse(HttpVersion.HTTP_1_1, HttpResponseStatus.OK);
response.setContent(ChannelBuffers.copiedBuffer(pixel_string, CharsetUtil.UTF_8));
EDIT: I also set the content type:
response.setHeader(HttpHeaders.Names.CONTENT_TYPE, "image/gif");
In Chrome, everything is fine. However, Firefox tells me that it cannot display the pixel (which is pretty bad for my app), as the pixel data is invalid.
After much investigation, I finally figured out a fix: changing the charset to ISO-8859-1.
response.setContent(ChannelBuffers.copiedBuffer(responseBuilder.pixel_string, CharsetUtil.ISO_8859_1));
I don't understand why this works, which makes me think I may run into trouble in some cases. I tried changing the Firefox preferences (to have UTF-8 as the default), but it doesn't change much.
Why does Firefox accept the ISO-8859-1 encoding and not UTF-8? Can I change that? Does anyone have a clue about the origin of the issue, and how can I be sure it will work whatever the user's settings?
Thanks
It's not Firefox that's accepting the encoding or not. It's your server.
When you do your base64 decode you produce a string that contains some characters... but what you really produced was bytes that you're then thinking of as characters somehow. Since a Java String is a container that holds a UTF-16 string, in practice what you're doing is taking each byte, treating it as a 16-bit integer, and constructing the UTF-16 "string" made up of those code units.
But when you want to put all this on the network, you have to convert your string to bytes, and the argument to copiedBuffer says how to do that. If converting to UTF-8, any character that came from a byte with the high bit set will end up encoded as a two-byte UTF-8 sequence. On the other hand, if converting to ISO-8859-1, the conversion just drops the high byte of each UTF-16 code unit (which in your case is always zero anyway).
So the conversion to ISO-8859-1 reproduces the actual byte array you got out of base64 decoding, while the conversion to UTF-8 produces... something else, which may or may not make any sense depending on the exact byte values.
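The effect is easy to demonstrate (Python used here for brevity): ISO-8859-1 round-trips every byte value, while UTF-8 re-encodes high bytes as two-byte sequences.

import base64

raw = base64.b64decode("R0lGODlhAQABAAAAACH5BAEKAAEALAAAAAABAAEAAAICTAEAOw==")
s = raw.decode("latin-1")        # one character per byte, lossless
print(len(s.encode("latin-1")))  # same length as raw: bytes preserved
print(len(s.encode("utf-8")))    # longer: bytes >= 0x80 become 2 bytes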
The copiedBuffer constructor you call is not appropriate for the type of data (binary) you are using. According to the JavaDoc of the Netty API, the one you are calling is:
Creates a new big-endian buffer whose content is the specified string
encoded in the specified charset.
Which means that your binary data is being "converted" to UTF-8 (which is meaningless). If you try to save the generated file and look at it with a hex editor, you'll probably see that it is corrupted.
Try with something like this (untested code):
static byte[] pixel_data = Base64.decodeBase64("R0lGODlhAQABAAAAACH5BAEKAAEALAAAAAABAAEAAAICTAEAOw==");
HttpResponse response = ...
response.setHeader(HttpHeaders.Names.CONTENT_TYPE, "image/gif");
response.setContent(ChannelBuffers.copiedBuffer(pixel_data));
