I'm trying to decrypt passwords from the ASP.NET Membership framework in a Node.js app.
They can be decrypted, since I get the plaintext version back from the Membership framework's User.GetPassword().
I've tried the following, but it doesn't work:
const crypto = require('crypto'); // missing from the original snippet; Node's built-in module

let encryptedPassword = 'LqOz9My...';
let passwordSalt = 'JQ2...';
let validationKey = '0123456789ABCEF';
let decryptionKey = '0123456789ABCEF';
var algorithm = 'aes128';
var decipher = crypto.createDecipher(algorithm, decryptionKey);
var decryptedPassword = decipher.update(encryptedPassword, 'utf8', 'utf8') + decipher.final('utf8');
Here's the solution:
const salt = '<my base64 salt>';
const saltBuffer = Buffer.from(salt, 'base64'); // 16 bytes
const decryptionKey = '<my hex decryption key>';
const decryptionKeyBuffer = Buffer.from(decryptionKey, 'hex'); // 24 bytes
const algorithm = 'aes192';
const encryptedPassword = '<my base64 encoded, encrypted string>';
const decipher = crypto.createDecipheriv(algorithm, decryptionKeyBuffer, saltBuffer);
decipher.setAutoPadding(false);
let dec = decipher.update(encryptedPassword, 'base64', 'utf8');
dec += decipher.final('utf8');
console.log(dec);
The problem was that I was using the wrong input encoding and the wrong algorithm. Here's how I figured it out.
Encryption
You can determine the AES variant from the length of your decryption key. In my case, once I converted the key to a Buffer, the length was 24 bytes, and because 24 * 8 = 192, my algorithm was aes192.
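A small helper makes that mapping explicit (a sketch of the reasoning above; aesVariantForKey is my own name, not a library function):
function aesVariantForKey(keyBuffer) {
    // AES keys are 16/24/32 bytes long, i.e. 128/192/256 bits
    switch (keyBuffer.length) {
        case 16: return 'aes128';
        case 24: return 'aes192';
        case 32: return 'aes256';
        default: throw new Error('Unsupported AES key length: ' + keyBuffer.length);
    }
}
console.log(aesVariantForKey(Buffer.alloc(24))); // prints "aes192"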
Encoding
In my original example, I had the encoding of the encrypted password as utf8. It was actually base64. I'm not sure how to determine this other than trying the various accepted parameters.
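If you have to guess, one option is to loop over the candidate input encodings and see which one decrypts to something readable (an illustrative sketch reusing the variables from the solution above):
for (const enc of ['base64', 'hex', 'latin1']) {
    try {
        const d = crypto.createDecipheriv(algorithm, decryptionKeyBuffer, saltBuffer);
        d.setAutoPadding(false);
        console.log(enc, '=>', d.update(encryptedPassword, enc, 'utf8') + d.final('utf8'));
    } catch (e) {
        console.log(enc, 'failed:', e.message);
    }
}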
I have been trying to decode the octet string as per the steps mentioned in
https://developer.apple.com/documentation/devicecheck/validating_apps_that_connect_to_your_server?language=objc
Here is what I have tried:
X509Certificate cert1 = getParentCertificate(new String(decodedCredCert));
System.out.println(cert1);
cert1.checkValidity(); // verify against apple app attest root ca
byte[] ext = cert1.getExtensionValue("1.2.840.113635.100.8.2");
ASN1InputStream bIn = new ASN1InputStream(ext);
ASN1Primitive obj = bIn.readObject();
ASN1OctetString string = (ASN1OctetString) obj;
byte[] octs = string.getOctets();
ASN1InputStream dIn = new ASN1InputStream(octs);
String octetString = ASN1Dump.dumpAsString(dIn.readObject());
I got the output as: "[[1]#8333585e692916d8cbcdce3c6aa2bd71617d54fed758957cfd6b50a2093fd506]"
For iOS App Attestation, follow the steps below to get the extension value and its corresponding octet string. As mentioned on that page,
Obtain the value of the credCert extension with OID
1.2.840.113635.100.8.2, which is a DER-encoded ASN.1 sequence. Decode the sequence and extract the single octet string that it contains.
Here is the sample code:
String ooid = "1.2.840.113635.100.8.2"; // the credCert extension OID
byte[] oidValue = credCert.getExtensionValue(ooid);
// Per RFC 5280, an extension value is itself wrapped in an OCTET STRING
DEROctetString envelope = (DEROctetString) new ASN1InputStream(oidValue).readObject();
// Inside is a DER-encoded SEQUENCE whose single element is a tagged octet string
DLSequence sequence = (DLSequence) new ASN1InputStream(envelope.getOctetStream()).readObject();
DLTaggedObject taggedObject = (DLTaggedObject) sequence.getObjectAt(0);
DEROctetString taggedObjectOctet = (DEROctetString) taggedObject.getObject();
log.debug("Octet String : {}", taggedObjectOctet.getOctets());
"Octet string" is just a spec phrase that modern languages call "byte array". You've extracted the value as of octs, and should compare that value to whatever nonce you're supposed to compare it against.
We're struggling with the decryption of DESFire data. We've authenticated successfully and could decrypt RndA', which was the same as the RndA we created.
Now we try to read an enciphered file from position 0 for 16 bytes.
From some Java library we could figure out that we have to use the enciphered command as the IV for the decryption. Is that right?
Here are the examples:
SessionKey: 0ba1caf83a26a72170149b7504895f34
ReadCommand: bd00000000100000
Crc32C for Cmd: D9CEE76B
Secret: 6a0d0f0d5c8f054b1e5914a42e49728622774c6272e5c34a69ed302251576aaf
So now we concatenate the ReadCommand with the Crc32C:
bd00000000100000D9CEE76B
Then we pad with zeros up to 16 bytes:
bd00000000100000D9CEE76B00000000
Then we compute e(cmd + crc + padding) with the session key and an all-zero IV, to obtain the IV for decrypting the response:
77E24803B5401C61F657607923E5A318
Now we decrypt the secret with the session key and IV := e(cmd + crc + padding), and get:
D1E9A4726C2A5C3FD5938E714C07524EF1F74BD9000000000000000000000000
There are many zeros, which makes me think we are not far from the answer.
So please, can someone tell us what is wrong?
We are using this library for Crc32C.
And here is the full code we are using within a test:
[Theory]
[InlineData("6a0d0f0d5c8f054b1e5914a42e49728622774c6272e5c34a69ed302251576aaf", "0ba1caf83a26a72170149b7504895f34", "bd00000000100000")]
public void DecryptData_Test(string secretS, string sessionKeyS, string cmdS)
{
    var cryptoAlgoFactory = new Func<SymmetricAlgorithm>(() => Aes.Create());
    var keyLength = 16;
    var secret = EncriptionHelper.StringToByte(secretS);
    var sessionKey = EncriptionHelper.StringToByte(sessionKeyS);
    var cmd = EncriptionHelper.StringToByte(cmdS);

    // Encrypt cmd + CRC32C + zero padding with an all-zero IV to derive the IV for the response
    var cryptoAlgo = cryptoAlgoFactory();
    cryptoAlgo.Mode = CipherMode.CBC;
    cryptoAlgo.Padding = PaddingMode.None;
    var encryptor = cryptoAlgo.CreateEncryptor(sessionKey, new byte[keyLength]);
    var crc32 = BitConverter.GetBytes(Crc32C.Crc32CAlgorithm.Compute(cmd));
    var padding = 0;
    if ((cmd.Length + crc32.Length) % keyLength != 0)
        padding = keyLength - ((cmd.Length + crc32.Length) % keyLength);
    var result = new byte[cmd.Length + crc32.Length + padding];
    Array.Copy(cmd, result, cmd.Length);
    Array.Copy(crc32, 0, result, cmd.Length, crc32.Length);
    var iv = encryptor.TransformFinalBlock(result, 0, result.Length);

    // Decrypt the response with the derived IV
    cryptoAlgo = cryptoAlgoFactory();
    cryptoAlgo.Mode = CipherMode.CBC;
    cryptoAlgo.Padding = PaddingMode.None;
    var decryptor = cryptoAlgo.CreateDecryptor(sessionKey, iv);
    var plain = decryptor.TransformFinalBlock(secret, 0, secret.Length);
    Assert.NotNull(plain);
}
We've found out that we have to run the command through CMAC, without the CRC32C.
Hence the following solution:
SessionKey: 0ba1caf83a26a72170149b7504895f34
ReadCommand: bd00000000100000
Secret: 6a0d0f0d5c8f054b1e5914a42e49728622774c6272e5c34a69ed302251576aaf
First, CMAC the cmd to obtain the IV for the subsequent decryption (note: use the SessionKey):
A70BEC41E95A706F11F7DA3D59F2F256
Then decrypt the secret in CBC mode with IV = CMAC(cmd) to get the result:
01000030303030313233343536100300F1F74BD9000000000000000000000000
Our own CMAC implementation still had something wrong in it, so we used a NuGet package instead, which worked fine.
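For completeness, here is a sketch of that flow in C#. The post doesn't say which NuGet package was used; BouncyCastle's CMac is assumed here as one option:
using System.Security.Cryptography;
using Org.BouncyCastle.Crypto.Engines;
using Org.BouncyCastle.Crypto.Macs;
using Org.BouncyCastle.Crypto.Parameters;

static byte[] DecryptWithCmacIv(byte[] sessionKey, byte[] cmd, byte[] secret)
{
    // IV = AES-CMAC(sessionKey, cmd)
    var cmac = new CMac(new AesEngine());
    cmac.Init(new KeyParameter(sessionKey));
    cmac.BlockUpdate(cmd, 0, cmd.Length);
    var iv = new byte[cmac.GetMacSize()]; // 16 bytes for AES
    cmac.DoFinal(iv, 0);

    // Decrypt the response in CBC mode with that IV, no padding
    using var aes = Aes.Create();
    aes.Mode = CipherMode.CBC;
    aes.Padding = PaddingMode.None;
    using var decryptor = aes.CreateDecryptor(sessionKey, iv);
    return decryptor.TransformFinalBlock(secret, 0, secret.Length);
}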
I want to encrypt an exe file (file.exe), write the encrypted version to a text file (fileenc.txt), and decrypt the data in the text file back to another exe file (filedec.exe).
file.exe and filedec.exe are the same and are expected to function the same way.
However, when I try this, filedec.exe does not work. An error popup says: "This app cannot run on your PC".
What could be the problem?
When I just read file.exe and write it to fileenc.txt without encryption or decryption, then read fileenc.txt and write the data to filedec.exe, again without encryption or decryption, filedec.exe works fine.
Also, when I encrypt and decrypt a text file with this code, it works fine too.
But when I encrypt and decrypt an exe on the fly, filedec.exe doesn't work.
Please help me out. Thank you, everyone.
Here is my full code:
Main();

function Main() {
    var arrKey = "encryptionkey";
    // Encrypt file.exe and write the encrypted form to fileenc.txt
    Crypt("C:\\...\\file.exe", "C:\\...\\fileenc.txt", arrKey);
    // Decrypt the previously encrypted fileenc.txt and write the decrypted form to filedec.exe
    Crypt("C:\\...\\fileenc.txt", "C:\\...\\filedec.exe", arrKey);
    // NOTE: file.exe and filedec.exe are expected to work fine when executed
}

function Crypt(fileIn, fileOut, key) {
    // Read fileIn as binary
    var adTypeBinary = 1;
    var BinaryStreamRead = new ActiveXObject("ADODB.Stream");
    BinaryStreamRead.Type = adTypeBinary;
    BinaryStreamRead.Open();
    BinaryStreamRead.LoadFromFile(fileIn);
    var fileInRead = BinaryStreamRead.Read();

    // Convert fileIn binary data to a string via a recordset field
    var objRS = new ActiveXObject("ADODB.Recordset");
    var DefinedSize = 1024;
    var adSaveCreateOverWrite = 2;
    var adFldLong = 0x80;
    var adVarChar = 201;
    var adTypeText = 2;
    objRS.Fields.Append("filedata", adVarChar, DefinedSize, adFldLong);
    objRS.Open();
    objRS.AddNew();
    objRS.Fields("filedata").AppendChunk(fileInRead);
    var binString = objRS("filedata").value;
    objRS.close();

    // Make key at least as long as the string version of fileIn
    while (key.length < binString.length) {
        key += key;
    }

    // XOR the converted string with the key
    var cryptresult = "";
    for (var index = 0; index < binString.length; index++) {
        cryptresult += String.fromCharCode(binString.charCodeAt(index) ^ key.charCodeAt(index));
    }

    // Write the crypted string to file
    var outStreamW = new ActiveXObject("ADODB.Stream");
    outStreamW.Type = adTypeText;
    // Charset: the default value seems to be `UTF-16` (BOM `0xFFFE` for text files)
    outStreamW.Open();
    outStreamW.WriteText(cryptresult);
    outStreamW.Position = 0;
    var outStreamA = new ActiveXObject("ADODB.Stream");
    outStreamA.Type = adTypeText;
    outStreamA.Charset = "windows-1252"; // important, see `cdoCharset Module Constants`
    outStreamA.Open();
    outStreamW.CopyTo(outStreamA); // convert encoding
    outStreamA.SaveToFile(fileOut, adSaveCreateOverWrite);
    outStreamW.Close();
    outStreamA.Close();
}
EDIT:
More troubleshooting shows that when I encrypt and decrypt file.exe ON THE FLY and then write the decrypted data to filedec.exe, filedec.exe works well.
But when I encrypt file.exe, write the encrypted data to fileenc.txt, then read fileenc.txt, decrypt the data and write it to filedec.exe (just like in my code), filedec.exe gets corrupted. This suggests that the way I write the encrypted data to fileenc.txt is the problem.
How do I go about fixing this?
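For anyone debugging the same thing, here is a diagnostic sketch (a hypothetical helper, not part of the original code) that round-trips the encrypted string through the same text-stream write/read path and reports the first character that changes:
function RoundTripCheck(text, tmpFile) {
    var adTypeText = 2, adSaveCreateOverWrite = 2;
    // Write the string exactly as Crypt does
    var w = new ActiveXObject("ADODB.Stream");
    w.Type = adTypeText;
    w.Charset = "windows-1252";
    w.Open();
    w.WriteText(text);
    w.SaveToFile(tmpFile, adSaveCreateOverWrite);
    w.Close();
    // Read it back with the same charset
    var r = new ActiveXObject("ADODB.Stream");
    r.Type = adTypeText;
    r.Charset = "windows-1252";
    r.Open();
    r.LoadFromFile(tmpFile);
    var back = r.ReadText();
    r.Close();
    if (back.length != text.length) return "length changed: " + text.length + " -> " + back.length;
    for (var n = 0; n < text.length; n++) {
        if (text.charCodeAt(n) != back.charCodeAt(n)) {
            return "first mismatch at index " + n + " (char code " + text.charCodeAt(n) + " became " + back.charCodeAt(n) + ")";
        }
    }
    return "round-trip OK";
}
If this reports a mismatch, the charset conversion is losing byte values, and the write path needs to be made byte-transparent.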
I'm using the Pkcs11Interop library and trying to test encryption and decryption with the RSA_PKCS_OAEP mechanism.
CK_RSA_PKCS_OAEP_PARAMS p = new CK_RSA_PKCS_OAEP_PARAMS();
p.HashAlg = (uint)CKM.CKM_SHA_1;
p.Mgf = (uint)CKG.CKG_MGF1_SHA1;
p.Source = (uint)CKZ.CKZ_DATA_SPECIFIED;
p.SourceData = IntPtr.Zero;
p.SourceDataLen = 0;
CK_MECHANISM mech = CkmUtils.CreateMechanism(CKM.CKM_RSA_PKCS_OAEP, p);
Everything is OK with the above mechanism, but if I change the hash algorithm to SHA-256 like below:
CK_RSA_PKCS_OAEP_PARAMS p = new CK_RSA_PKCS_OAEP_PARAMS();
p.HashAlg = (uint)CKM.CKM_SHA256;
p.Mgf = (uint)CKG.CKG_MGF1_SHA256;
p.Source = (uint)CKZ.CKZ_DATA_SPECIFIED;
p.SourceData = IntPtr.Zero;
p.SourceDataLen = 0;
CK_MECHANISM mech = CkmUtils.CreateMechanism(CKM.CKM_RSA_PKCS_OAEP, p);
Then I get a CKR_ARGUMENTS_BAD exception. I have been searching and debugging for a while but have found nothing.
I had the same problem with a Luna HSM (but was given CKR_MECHANISM_PARAM_INVALID instead).
That version of the HSM simply did not support OAEP with SHA-256, and a firmware upgrade was needed. After the firmware upgrade it worked without any problems. Check whether your device supports this variant.
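A quick way to check is to ask the token which mechanisms it exposes (a sketch using Pkcs11Interop's HighLevelAPI; the library path and slot choice are placeholders). Note that the mechanism list only tells you CKM_RSA_PKCS_OAEP is available at all; which hash/MGF combinations the firmware accepts may still require a test encryption:
using System;
using Net.Pkcs11Interop.Common;
using Net.Pkcs11Interop.HighLevelAPI;

var factories = new Pkcs11InteropFactories();
using (IPkcs11Library library = factories.Pkcs11LibraryFactory.LoadPkcs11Library(
    factories, @"C:\path\to\your-pkcs11-module.dll", AppType.MultiThreaded))
{
    ISlot slot = library.GetSlotList(SlotsType.WithTokenPresent)[0];
    bool oaepSupported = slot.GetMechanismList().Contains(CKM.CKM_RSA_PKCS_OAEP);
    Console.WriteLine("CKM_RSA_PKCS_OAEP supported: " + oaepSupported);
}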
Your code seems OK. I used (in Java):
CK_RSA_PKCS_OAEP_PARAMS mechanismParams = new CK_RSA_PKCS_OAEP_PARAMS(
    CKM.SHA_1,
    CKG.MGF1_SHA1,
    new CK_RSA_PKCS_OAEP_SOURCE_TYPE(CKZ.DATA_SPECIFIED.longValue()),
    null,
    0
);
and
CK_RSA_PKCS_OAEP_PARAMS mechanismParams = new CK_RSA_PKCS_OAEP_PARAMS(
    CKM.SHA256,
    CKG.MGF1_SHA256,
    new CK_RSA_PKCS_OAEP_SOURCE_TYPE(CKZ.DATA_SPECIFIED.longValue()),
    null,
    0
);
Good luck!
I have the following Go code:
ciphertext := "Zff9c+F3gZu/lsARvPhpMau50KUkMAie4j8MYfb12HMWhkLqZreTk8RPbtRB7RDG3QFw7Y0FXJsCq/EBEAz//XoeSZmqZXoyq2Cx8ZV+/Rw="
decodedText, _ := base64.StdEncoding.DecodeString(ciphertext)
decodedIv, _ := base64.StdEncoding.DecodeString("u9CV7oR2w+IIk8R0hppxaw==")
newCipher, _ := aes.NewCipher([]byte("~NB8CcOL#J!H?|Yr"))
cfbdec := cipher.NewCBCDecrypter(newCipher, decodedIv)
cfbdec.CryptBlocks(decodedText, decodedText)
data, _ := base64.StdEncoding.DecodeString(string(decodedText))
println(string(data))
The output is {"data":{"value":300}, "SEQN":700 , "msg":"IT WORKS!!"
It was encrypted with the following CryptoJS code:
function encrypt(message, key) {
let keyHex = CryptoJS.enc.Hex.parse(parseToHex(key))
let iv = CryptoJS.lib.WordArray.random(128 / 8);
let wordArray = CryptoJS.enc.Utf8.parse(message);
let base64 = CryptoJS.enc.Base64.stringify(wordArray);
let encrypted = CryptoJS.AES.encrypt(base64, keyHex, { iv: iv });
return {
cipher: encrypted.ciphertext.toString(CryptoJS.enc.Base64),
iv: CryptoJS.enc.Base64.stringify(iv),
length: base64.length,
size: encrypted.ciphertext.sigBytes,
}
}
And can be decrypted with
function decrypt(message, key, iv) {
let ivEX = CryptoJS.enc.Hex.parse(decodeToHex(iv));
let keyEX = CryptoJS.enc.Hex.parse(parseToHex(key));
let bytes = CryptoJS.AES.decrypt(message, keyEX , { iv: ivEX});
let plaintext = bytes.toString(CryptoJS.enc.Base64);
return decodeToString(decodeToString(plaintext));
}
The output is {"data":{"value":300}, "SEQN":700 , "msg":"IT WORKS!!" } - this is the correct output
Why does Go produce different output?
Check your errors please. ALWAYS.
illegal base64 data at input byte 75
https://play.golang.org/p/dRLIT51u4I
More specifically, the value at byte 75 is 5, which is out of the range of characters valid in base64. In ASCII, it is the ENQ (enquiry) character. Why this ends up in your final base64 string was beyond me at first.
EDIT: OK, found the issue. The 5 consecutive bytes containing the value 5 at the end of the decrypted text are not a mangled = padding character: they are the PKCS#7 block padding that CryptoJS added before encrypting. Here is a playground link that shows it fixed: https://play.golang.org/p/tf3OZ9XG1M
EDIT: As per matt's comments, I updated the fix function to simply remove all the PKCS#7 block padding and use RawStdEncoding for the last base64 decode. This should now be a reasonable fix.
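For reference, a self-contained version of that fix (pkcs7Unpad is a helper written for this sketch, and the trailing-= trim is only a guard in case the inner base64 comes out padded):
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"encoding/base64"
	"errors"
	"fmt"
	"strings"
)

// pkcs7Unpad strips PKCS#7 block padding: the last byte encodes the pad length.
func pkcs7Unpad(b []byte) ([]byte, error) {
	if len(b) == 0 {
		return nil, errors.New("empty input")
	}
	n := int(b[len(b)-1])
	if n == 0 || n > len(b) {
		return nil, errors.New("invalid padding")
	}
	return b[:len(b)-n], nil
}

func main() {
	ciphertext := "Zff9c+F3gZu/lsARvPhpMau50KUkMAie4j8MYfb12HMWhkLqZreTk8RPbtRB7RDG3QFw7Y0FXJsCq/EBEAz//XoeSZmqZXoyq2Cx8ZV+/Rw="
	decodedText, err := base64.StdEncoding.DecodeString(ciphertext)
	if err != nil {
		panic(err)
	}
	decodedIv, err := base64.StdEncoding.DecodeString("u9CV7oR2w+IIk8R0hppxaw==")
	if err != nil {
		panic(err)
	}
	block, err := aes.NewCipher([]byte("~NB8CcOL#J!H?|Yr"))
	if err != nil {
		panic(err)
	}
	cipher.NewCBCDecrypter(block, decodedIv).CryptBlocks(decodedText, decodedText)

	// Strip the PKCS#7 padding before the inner base64 decode
	unpadded, err := pkcs7Unpad(decodedText)
	if err != nil {
		panic(err)
	}
	// The plaintext is itself base64 (the JS side base64-encodes the message
	// before encrypting); RawStdEncoding tolerates the missing '=' padding.
	inner := strings.TrimRight(string(unpadded), "=")
	data, err := base64.RawStdEncoding.DecodeString(inner)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(data))
}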