Problems using CNG and BCRYPT_KDF_SP80056A_CONCAT KDF - encryption

I am in the process of implementing CNG ECDH, and I am trying to use the BCRYPT_KDF_SP80056A_CONCAT KDF to derive a symmetric AES-256 key with BCryptDeriveKey(). I am having a problem: I always get back a 0xC000000D (STATUS_INVALID_PARAMETER) status.
I have generated the shared secret successfully, and I have created the BCryptBufferDesc, which holds an array of BCryptBuffer entries with one AlgorithmID, one PartyUInfo, and one PartyVInfo "other info" buffer. I think I have the structures all defined and populated properly. I am just picking arbitrary values for the PartyU and PartyV bytes (I tried 1 byte and 16 bytes for each, but I get the same result). The NIST documentation gives no details about what the other info should be.
I have followed the Microsoft documentation for creating these structures, using their strings, defines, etc. With the standard L"HASH" KDF it works and I get the same derived key on both sides, but with the concatenation KDF I always get the same 0xC000000D status back.
Has anybody else been able to use the BCRYPT_KDF_SP80056A_CONCAT CNG KDF successfully? If so, do you have any hints?

This worked for me:
ULONG derivedKeySize = 32; // AES-256 key length in bytes

// Parameter list: AlgorithmID, PartyUInfo and PartyVInfo (all empty here).
BCryptBufferDesc params;
params.ulVersion = BCRYPTBUFFER_VERSION;
params.cBuffers = 3;
params.pBuffers = new BCryptBuffer[params.cBuffers];
params.pBuffers[0].cbBuffer = 0;
params.pBuffers[0].BufferType = KDF_ALGORITHMID;
params.pBuffers[0].pvBuffer = new BYTE[0];
params.pBuffers[1].cbBuffer = 0;
params.pBuffers[1].BufferType = KDF_PARTYUINFO;
params.pBuffers[1].pvBuffer = new BYTE[0];
params.pBuffers[2].cbBuffer = 0;
params.pBuffers[2].BufferType = KDF_PARTYVINFO;
params.pBuffers[2].pvBuffer = new BYTE[0];

// First call with a NULL output buffer to query the required key size.
NTSTATUS rv = BCryptDeriveKey(secretHandle, BCRYPT_KDF_SP80056A_CONCAT, &params, NULL, 0, &derivedKeySize, 0);
if (rv != 0) { /*fail*/ }

// Second call actually derives the key bytes.
PUCHAR derivedKey = new UCHAR[derivedKeySize];
rv = BCryptDeriveKey(secretHandle, BCRYPT_KDF_SP80056A_CONCAT, &params, derivedKey, derivedKeySize, &derivedKeySize, 0);
if (rv != 0) { /*fail*/ }

Related

Decrypt DESFire ReadData Session

We're struggling with the decryption of DESFire data. We've authenticated successfully and could decrypt RndA', which matched the RndA we created.
Now we try to read an enciphered file from position 0 for 16 bytes.
From some Java library we could figure out that we have to use the enciphered command as the IV for the decryption. Is that right?
Here are the examples:
SessionKey: 0ba1caf83a26a72170149b7504895f34
ReadCommand: bd00000000100000
Crc32C for Cmd: D9CEE76B
Secret: 6a0d0f0d5c8f054b1e5914a42e49728622774c6272e5c34a69ed302251576aaf
So now we concatenate the ReadCommand with the Crc32C:
bd00000000100000D9CEE76B
Then we pad with zeros up to 16 bytes:
bd00000000100000D9CEE76B00000000
Then we compute e(cmd + crc + padding) with the session key and IV 0 to obtain the IV for decrypting the response:
77E24803B5401C61F657607923E5A318
Now we decrypt the secret with the session key and IV := e(cmd + crc32) and get:
D1E9A4726C2A5C3FD5938E714C07524EF1F74BD9000000000000000000000000
There are many zeros, which makes me think we are not far from the answer.
So please, can someone tell us what is wrong?
We are using this library for Crc32C.
And here is the full code we are using within a test:
[Theory]
[InlineData("6a0d0f0d5c8f054b1e5914a42e49728622774c6272e5c34a69ed302251576aaf", "0ba1caf83a26a72170149b7504895f34", "bd00000000100000")]
public void DecryptData_Test(string secretS, string sessionKeyS, string cmdS)
{
    var cryptoAlgoFactory = new Func<SymmetricAlgorithm>(() => Aes.Create());
    var keyLength = 16;
    var secret = EncriptionHelper.StringToByte(secretS);
    var sessionKey = EncriptionHelper.StringToByte(sessionKeyS);
    var cmd = EncriptionHelper.StringToByte(cmdS);

    // Encrypt cmd + CRC32C (zero-padded to the block size) with a zero IV
    // to obtain the IV for decrypting the response.
    var crytoAlgo = cryptoAlgoFactory();
    crytoAlgo.Mode = CipherMode.CBC;
    crytoAlgo.Padding = PaddingMode.None;
    var encryptor = crytoAlgo.CreateEncryptor(sessionKey, new byte[keyLength]);
    var crc32 = BitConverter.GetBytes(Crc32C.Crc32CAlgorithm.Compute(cmd));
    var padding = 0;
    if ((cmd.Length + crc32.Length) % keyLength != 0)
        padding = keyLength - ((cmd.Length + crc32.Length) % keyLength);
    var result = new byte[cmd.Length + crc32.Length + padding];
    Array.Copy(cmd, result, cmd.Length);
    Array.Copy(crc32, 0, result, cmd.Length, crc32.Length);
    var iv = encryptor.TransformFinalBlock(result, 0, result.Length);

    // Decrypt the response with the session key and the IV computed above.
    crytoAlgo = cryptoAlgoFactory();
    crytoAlgo.Mode = CipherMode.CBC;
    crytoAlgo.Padding = PaddingMode.None;
    var decryptor = crytoAlgo.CreateDecryptor(sessionKey, iv);
    var plain = decryptor.TransformFinalBlock(secret, 0, secret.Length);
    Assert.NotNull(plain);
}
We've found out that we have to process the command with CMAC, and without the CRC32C.
Hence the following solution:
SessionKey: 0ba1caf83a26a72170149b7504895f34
ReadCommand: bd00000000100000
Secret: 6a0d0f0d5c8f054b1e5914a42e49728622774c6272e5c34a69ed302251576aaf
First, CMAC the command to get the following IV for the subsequent decryption (note: use the SessionKey):
A70BEC41E95A706F11F7DA3D59F2F256
Then decrypt the secret in CBC mode with IV := CMAC(cmd) to get the result:
01000030303030313233343536100300F1F74BD9000000000000000000000000
There was still something wrong with our own CMAC implementation, so we used a NuGet package, which worked fine.
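For reference, AES-CMAC is short enough to sketch by hand. Below is a minimal C# sketch of AES-CMAC per RFC 4493 / NIST SP 800-38B; the AesCmac class and its helper names are illustrative (not from any library), and it assumes the DESFire rolling IV is still zero right after authentication:
using System;
using System.Security.Cryptography;

static class AesCmac
{
    // Minimal AES-CMAC per RFC 4493 / NIST SP 800-38B. Helper names are illustrative.
    public static byte[] ComputeCmac(byte[] key, byte[] message)
    {
        using (Aes aes = Aes.Create())
        {
            aes.Key = key;
            aes.Mode = CipherMode.ECB;
            aes.Padding = PaddingMode.None;

            // Subkeys: K1 = double(E_K(0^128)), K2 = double(K1).
            byte[] k1 = DoubleInGf(EncryptBlock(aes, new byte[16]));
            byte[] k2 = DoubleInGf(k1);

            // Last block: XOR with K1 if complete, otherwise pad with 0x80 00.. and XOR with K2.
            int blockCount = Math.Max(1, (message.Length + 15) / 16);
            bool lastIsComplete = message.Length > 0 && message.Length % 16 == 0;
            int lastLen = message.Length - (blockCount - 1) * 16;
            byte[] last = new byte[16];
            Array.Copy(message, (blockCount - 1) * 16, last, 0, lastLen);
            if (!lastIsComplete)
                last[lastLen] = 0x80;
            byte[] subkey = lastIsComplete ? k1 : k2;
            for (int i = 0; i < 16; i++)
                last[i] ^= subkey[i];

            // CBC-MAC over all blocks with a zero IV; the final cipher block is the CMAC.
            byte[] x = new byte[16];
            for (int b = 0; b < blockCount - 1; b++)
            {
                for (int i = 0; i < 16; i++)
                    x[i] ^= message[b * 16 + i];
                x = EncryptBlock(aes, x);
            }
            for (int i = 0; i < 16; i++)
                x[i] ^= last[i];
            return EncryptBlock(aes, x);
        }
    }

    static byte[] EncryptBlock(Aes aes, byte[] block)
    {
        using (ICryptoTransform enc = aes.CreateEncryptor())
            return enc.TransformFinalBlock(block, 0, block.Length);
    }

    // Left-shift a 16-byte value by one bit; XOR 0x87 into the last byte if the MSB was set.
    static byte[] DoubleInGf(byte[] input)
    {
        byte[] output = new byte[16];
        byte carry = 0;
        for (int i = 15; i >= 0; i--)
        {
            output[i] = (byte)((input[i] << 1) | carry);
            carry = (byte)(input[i] >> 7);
        }
        if ((input[0] & 0x80) != 0)
            output[15] ^= 0x87;
        return output;
    }
}
With the values above, AesCmac.ComputeCmac(sessionKey, cmd) should then serve as the IV for the CBC decryption in the test, as described in the solution.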

Executable binary file not working after XOR encryption and decryption

I want to encrypt an exe file (file.exe), write the encrypted version to a text file (fileenc.txt) and decrypt the data in the text file back to another exe file (filedec.exe).
file.exe and filedec.exe are the same and are expected to function the same way.
However, when I try to do this, filedec.exe does not work. The error popup says: "This app cannot run on your PC".
What could be the problem?
However, when I just read the file.exe, write to fileenc.txt without encryption or decryption, and then read fileenc.txt and write data to filedec.exe without encryption or decryption, filedec.exe seems to work fine.
Also, when I try encrypting and decrypting a text file with this code, it works fine too.
But when I encrypt and decrypt an exe on the fly, filedec.exe doesn't work.
Please help me out. Thank you everyone.
Here is my full code:
Main();

function Main() {
    var arrKey;
    arrKey = "encryptionkey";

    // Encrypt file.exe and write the encrypted form to fileenc.txt
    Crypt("C:\\...\\file.exe", "C:\\...\\fileenc.txt", arrKey);
    // Decrypt the previously encrypted fileenc.txt and write the decrypted form to filedec.exe
    Crypt("C:\\...\\fileenc.txt", "C:\\...\\filedec.exe", arrKey);
    // NOTE: file.exe and filedec.exe are expected to work fine when executed
}

function Crypt(fileIn, fileOut, key) {
    var fileInRead;

    // Read fileIn in binary mode
    var adTypeBinaryRead = 1;
    var BinaryStreamRead;
    BinaryStreamRead = new ActiveXObject("ADODB.Stream");
    BinaryStreamRead.Type = adTypeBinaryRead;
    BinaryStreamRead.Open();
    BinaryStreamRead.LoadFromFile(fileIn);
    fileInRead = BinaryStreamRead.Read();

    // Convert fileIn binary data to string
    var objRS = new ActiveXObject("ADODB.Recordset");
    var DefinedSize = 1024;
    var adSaveCreateOverWrite = 2;
    var adFldLong = 0x80;
    var adVarChar = 201;
    var adTypeText = 2;
    objRS.Fields.Append("filedata", adVarChar, DefinedSize, adFldLong);
    objRS.Open();
    objRS.AddNew();
    objRS.Fields("filedata").AppendChunk(fileInRead);
    var binString = objRS("filedata").value;
    objRS.close();

    // Make key as long as the string version of fileIn
    while (key.length < binString.length) {
        key += key;
    }

    // Crypt the converted string with the key
    var k, ss, q;
    var cryptresult = "";
    i = 0;
    for (var index = 0; index < binString.length; index++) {
        k = key.substr(i, 1);
        q = binString.substr(i, 1);
        ss = q.charCodeAt(0);
        cryptresult = cryptresult + String.fromCharCode(q.charCodeAt(0) ^ k.charCodeAt(0));
        i = i + 1;
    }

    // Write the crypted string to file
    var outStreamW = new ActiveXObject("ADODB.Stream");
    outStreamW.Type = adTypeText;
    // Charset: the default value seems to be `UTF-16` (BOM `0xFFFE` for text files)
    outStreamW.Open();
    outStreamW.WriteText(cryptresult);
    outStreamW.Position = 0;
    var outStreamA = new ActiveXObject("ADODB.Stream");
    outStreamA.Type = adTypeText;
    outStreamA.Charset = "windows-1252"; // important, see `cdoCharset Module Constants`
    outStreamA.Open();
    outStreamW.CopyTo(outStreamA); // convert encoding
    outStreamA.SaveToFile(fileOut, adSaveCreateOverWrite);
    outStreamW.Close();
    outStreamA.Close();
}
EDIT:
More troubleshooting shows that when I encrypt and decrypt file.exe on the fly and then write the decrypted data to filedec.exe, filedec.exe works well.
But when I encrypt file.exe and write the encrypted data to fileenc.txt, then read fileenc.txt, decrypt the data, and write it to filedec.exe (as in my code), filedec.exe is corrupted. This suggests that the manner in which I write the encrypted data to fileenc.txt is the problem.
Please help; how do I go about fixing this?
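The likely culprit is that the encrypted bytes are round-tripped through ADODB text streams: the XOR output contains arbitrary character codes, and the UTF-16 to windows-1252 conversion is not lossless for arbitrary values, so some bytes are altered on the way to fileenc.txt. Staying in binary mode end to end avoids this. As an illustration of the idea only (not the JScript environment above), here is a minimal byte-for-byte XOR round trip in C#; the XorCrypt helper and the paths are hypothetical:
using System.IO;

class XorCryptDemo
{
    // Illustrative helper: XOR every byte with a repeating key, staying in binary mode.
    // The same call both encrypts and decrypts, since XOR is its own inverse.
    static void XorCrypt(string fileIn, string fileOut, byte[] key)
    {
        byte[] data = File.ReadAllBytes(fileIn);
        for (int i = 0; i < data.Length; i++)
            data[i] ^= key[i % key.Length];
        File.WriteAllBytes(fileOut, data);
    }

    static void Main()
    {
        byte[] key = System.Text.Encoding.ASCII.GetBytes("encryptionkey");
        XorCrypt(@"C:\path\file.exe", @"C:\path\fileenc.bin", key);    // encrypt
        XorCrypt(@"C:\path\fileenc.bin", @"C:\path\filedec.exe", key); // decrypt
        // filedec.exe comes out byte-identical to file.exe.
    }
}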

RSA OAEP encryption with SHA-256 fails, while with SHA-1 it is OK

I'm using the Pkcs11Interop library and trying to test encryption and decryption with the CKM_RSA_PKCS_OAEP mechanism.
CK_RSA_PKCS_OAEP_PARAMS p = new CK_RSA_PKCS_OAEP_PARAMS();
p.HashAlg = (uint)CKM.CKM_SHA_1;
p.Mgf = (uint)CKG.CKG_MGF1_SHA1;
p.Source = (uint)CKZ.CKZ_DATA_SPECIFIED;
p.SourceData = IntPtr.Zero;
p.SourceDataLen = 0;
CK_MECHANISM mech = CkmUtils.CreateMechanism(CKM.CKM_RSA_PKCS_OAEP, p);
Everything is OK with the above mechanism but if I change the hash algorithm to SHA-256 like below:
CK_RSA_PKCS_OAEP_PARAMS p = new CK_RSA_PKCS_OAEP_PARAMS();
p.HashAlg = (uint)CKM.CKM_SHA256;
p.Mgf = (uint)CKG.CKG_MGF1_SHA256;
p.Source = (uint)CKZ.CKZ_DATA_SPECIFIED;
p.SourceData = IntPtr.Zero;
p.SourceDataLen = 0;
CK_MECHANISM mech = CkmUtils.CreateMechanism(CKM.CKM_RSA_PKCS_OAEP, p);
Then I get a CKR_ARGUMENTS_BAD exception. I have been searching and debugging for a while but have found nothing.
I had the same problem with a Luna HSM (but was given CKR_MECHANISM_PARAM_INVALID).
That version of the HSM simply did not support OAEP with SHA-256, and a firmware upgrade was needed. After the firmware upgrade it worked without any problems. Check whether your device supports this variant.
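To make that check programmatically, something along these lines should work with Pkcs11Interop's high-level API (names per Pkcs11Interop 5.x; the library path is a placeholder, and note that the mechanism list does not tell you which OAEP hash/MGF combinations the token accepts):
using System;
using System.Collections.Generic;
using Net.Pkcs11Interop.Common;
using Net.Pkcs11Interop.HighLevelAPI;

var factories = new Pkcs11InteropFactories();
using IPkcs11Library lib = factories.Pkcs11LibraryFactory.LoadPkcs11Library(
    factories, @"C:\path\to\your-pkcs11-module.dll", AppType.MultiThreaded);

ISlot slot = lib.GetSlotList(SlotsType.WithTokenPresent)[0];

// Is CKM_RSA_PKCS_OAEP advertised at all?
List<CKM> mechanisms = slot.GetMechanismList();
Console.WriteLine("OAEP listed: " + mechanisms.Contains(CKM.CKM_RSA_PKCS_OAEP));

// Key-size limits and flags reported for the mechanism.
IMechanismInfo info = slot.GetMechanismInfo(CKM.CKM_RSA_PKCS_OAEP);
Console.WriteLine($"Min key: {info.MinKeySize}, max key: {info.MaxKeySize}");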
Your code seems OK. I used (in Java):
CK_RSA_PKCS_OAEP_PARAMS mechanismParams = new CK_RSA_PKCS_OAEP_PARAMS(
CKM.SHA_1,
CKG.MGF1_SHA1,
new CK_RSA_PKCS_OAEP_SOURCE_TYPE(CKZ.DATA_SPECIFIED.longValue())
, null, 0
);
and
CK_RSA_PKCS_OAEP_PARAMS mechanismParams = new CK_RSA_PKCS_OAEP_PARAMS(
CKM.SHA256,
CKG.MGF1_SHA256,
new CK_RSA_PKCS_OAEP_SOURCE_TYPE(CKZ.DATA_SPECIFIED.longValue())
, null, 0
);
Good luck!

In C#, how do I create an invalid X509Chain?

The X509ChainStatusFlags enum contains a lot of possible values: https://learn.microsoft.com/en-us/dotnet/api/system.security.cryptography.x509certificates.x509chainstatusflags?view=netframework-4.8
Are there easy ways to construct a certificate and chain that produce some of these flags? I want to construct them in order to integration-test my certificate validation logic.
Each different kind of failure requires a different amount of work to test for. Some are easy, some require heroic effort.
The easiest: error code 1: X509ChainStatusFlags.NotTimeValid.
X509Certificate2 cert = ...;
X509Chain chain = new X509Chain();
chain.ChainPolicy.VerificationTime = cert.NotBefore.AddSeconds(-1);
bool valid = chain.Build(cert);
// valid is false, and the 0 element will have NotTimeValid as one of the reasons.
Next up: X509ChainStatusFlags.NotSignatureValid.
X509Certificate2 cert = ...;
byte[] certBytes = cert.RawData;
// flip all the bits in the last byte
certBytes[certBytes.Length - 1] ^= 0xFF;
X509Certificate2 badCert = new X509Certificate2(certBytes);
X509Chain chain = new X509Chain();
bool valid = chain.Build(badCert);
// valid is false. On macOS this results in PartialChain,
// on Windows and Linux it reports NotSignatureValid in element 0.
Next up: X509ChainStatusFlags.NotValidForUsage.
X509Certificate2 cert = ...;
X509Chain chain = new X509Chain();
chain.ChainPolicy.ApplicationPolicy.Add(new Oid("0.0", null));
bool valid = chain.Build(cert);
// valid is false if the certificate has an EKU extension,
// since it shouldn't contain the 0.0 OID.
// and the 0 element will report NotValidForUsage.
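Another easy one: X509ChainStatusFlags.UntrustedRoot. A freshly generated self-signed certificate is not in the trusted root store, so building a chain for it should fail with UntrustedRoot (this sketch assumes CertificateRequest is available, i.e. .NET Framework 4.7.2+ or .NET Core 2.0+):
RSA rsa = RSA.Create(2048);
var request = new CertificateRequest("CN=Untrusted Test Root", rsa, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
X509Certificate2 selfSigned = request.CreateSelfSigned(DateTimeOffset.UtcNow.AddDays(-1), DateTimeOffset.UtcNow.AddDays(1));
X509Chain chain = new X509Chain();
bool valid = chain.Build(selfSigned);
// valid is false, and the chain status should include UntrustedRoot.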
Some of the more complicated ones require building certificate chains incorrectly, such as giving a child certificate a NotBefore/NotAfter that isn't nested within the CA's NotBefore/NotAfter. Some of these heroic efforts are tested in https://github.com/dotnet/runtime/blob/4f9ae42d861fcb4be2fcd5d3d55d5f227d30e723/src/libraries/System.Security.Cryptography.X509Certificates/tests/DynamicChainTests.cs and/or https://github.com/dotnet/runtime/blob/4f9ae42d861fcb4be2fcd5d3d55d5f227d30e723/src/libraries/System.Security.Cryptography.X509Certificates/tests/RevocationTests/DynamicRevocationTests.cs.

Define dictionary in protocol buffer

I'm new to both protocol buffers and C++, so this may be a basic question, but I haven't had any luck finding answers. Basically, I want the functionality of a dictionary defined in my .proto file, the way an enum is. I'm using the protocol buffer to send data, and I want to define the units and their respective display names. An enum would let me define the units, but I don't know how to map the human-readable strings to them.
As an example of what I mean, the .proto file might look something like:
message DataPack {
  // obviously not valid, but something like this
  dict UnitType {
    KmPerHour = "km/h";
    MiPerHour = "mph";
  }

  required int32 id = 1;
  repeated DataPoint pt = 2;

  message DataPoint {
    required int32 id = 1;
    required int32 value = 2;
    optional UnitType theunit = 3;
  }
}
and then have something like this to create / handle messages:
// construct
DataPack pack;
pack.set_id(123);
DataPack::DataPoint* pt = pack.add_pt();
pt->set_id(456);
pt->set_value(789);
pt->set_theunit(DataPack::UnitType::KmPerHour);
// read values
DataPack::UnitType theunit = pt->theunit();
cout << theunit.name << endl; // print "km/h"
I could just define an enum with the unit names and write a function to map them to strings on the receiving end, but it would make more sense to have them defined in the same spot, and that solution seems too complicated (at least, for someone who has lately been spoiled by the conveniences of Python). Is there an easier way to accomplish this?
You could use custom options to associate a string with each enum member:
https://developers.google.com/protocol-buffers/docs/proto#options
It would look like this in the .proto:
import "google/protobuf/descriptor.proto";

extend google.protobuf.EnumValueOptions {
  optional string name = 12345;
}

enum UnitType {
  KmPerHour = 1 [(name) = "km/h"];
  MiPerHour = 2 [(name) = "mph"];
}
Beware, though, that some third-party protobuf libraries don't understand these options.
In proto3, it's:
extend google.protobuf.EnumValueOptions {
  string name = 12345;
}

enum UnitType {
  KM_PER_HOUR = 0 [(name) = "km/h"];
  MI_PER_HOUR = 1 [(name) = "mph"];
}
and to access it in Java:
UnitType.KM_PER_HOUR.getValueDescriptor().getOptions().getExtension(MyOuterClass.name);
