I have to encrypt a text using the DES algorithm with a hash created with MD5.
The MD5 function takes two parameters: a salt (byte[8]) and a key (a 6-character string), and it has to iterate 1000 times. The MD5 function returns a byte[16].
The DES function parameters are the string to encrypt and the key (returned by the MD5 function). But when I try to assign the key value to the key encoder, I get an exception because it expects a byte[8] instead of a byte[16]. I've tried taking the first 8 bytes and the last 8 bytes, but neither works (I have an example and I have to get the same result).
Any ideas?
DES (not to be confused with 3DES) has 56-bit keys. Your problem will require more definition in order to determine the correct choice for the key.
There is no reason to use DES today. There are far better, unbroken, algorithms available.
Why are you using the hash as an encryption key? Keys should be cryptographically secure random data, something a hash is not. Hashing itself is not encryption at all.
DES keys are 56 bits, normally packaged in 8 bytes where the least significant bit of each byte is a parity bit. So taking the first 8 bytes of the hash gives you the right container size, but the parity bits will almost certainly be wrong, and strict implementations reject such keys. If you must use the hash as a key source, extract 56 bits and set the parity bits correctly.
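If you are trying to match an existing example, note that the classic PKCS#5 v1.5 scheme (PBEWithMD5AndDES) splits the 16-byte iterated MD5 output into the first 8 bytes as the DES key and the last 8 bytes as the IV, which may well be what your reference does. As a sketch of the parity point above (Python 3, with illustrative names, not the original poster's code), here is how the first 8 digest bytes become a DES key with correct odd parity, in case your library rejects keys whose parity bits are wrong:

# Sketch: take 8 bytes of key material and force odd parity on each
# byte, as DES expects; the digest value is a stand-in, not real output.
def set_odd_parity(key8):
    out = bytearray()
    for b in key8:
        b &= 0xFE                           # clear the existing parity bit
        ones = bin(b).count("1") % 2        # count the remaining set bits
        out.append(b | (0 if ones else 1))  # set the low bit so the total is odd
    return bytes(out)

digest = bytes(range(16))             # stand-in for the 16-byte MD5 result
des_key = set_odd_parity(digest[:8])  # first 8 bytes -> parity-adjusted DES key
assert len(des_key) == 8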
Related
If the user enters a wrong key for AES decryption, some garbage data is generated. I want to verify the given decryption key and throw an error if the key is incorrect. How can I verify the key entered by the user?
Use an HMAC. The basic premise is that you run the plaintext through an HMAC, add the result to the plaintext and then encrypt. Then do the opposite when decrypting. If the plaintext and HMAC result match, then you know you've got the correct key.
OR, if you want to know prior to decryption, use the key material provided by the user to derive two further keys (using, say PBKDF2). Use one for encryption and another for an HMAC. In this case, encrypt first and then apply the HMAC using the second key. This way you can compute the HMAC and check if it matches before you decrypt.
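As a concrete illustration of that second approach (a PyCryptodome sketch; the key sizes, iteration count, and message are arbitrary assumptions): derive two keys with PBKDF2, encrypt-then-MAC, and verify the HMAC before decrypting.

# Sketch: derive separate encryption and MAC keys from one password,
# encrypt-then-MAC, and check the tag before touching the ciphertext.
from Crypto.Cipher import AES
from Crypto.Hash import HMAC, SHA256
from Crypto.Protocol.KDF import PBKDF2
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

password = b"user passphrase"
salt = get_random_bytes(16)

# 64 bytes of key material: first half encrypts, second half authenticates
km = PBKDF2(password, salt, dkLen=64, count=100000, hmac_hash_module=SHA256)
enc_key, mac_key = km[:32], km[32:]

iv = get_random_bytes(16)
ciphertext = AES.new(enc_key, AES.MODE_CBC, iv).encrypt(pad(b"secret message", 16))
tag = HMAC.new(mac_key, iv + ciphertext, digestmod=SHA256).digest()

# On decryption, recompute the HMAC first: a wrong password gives
# different keys, so verify() raises ValueError before any decryption.
HMAC.new(mac_key, iv + ciphertext, digestmod=SHA256).verify(tag)
plaintext = unpad(AES.new(enc_key, AES.MODE_CBC, iv).decrypt(ciphertext), 16)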
Simplest approach is to add a magic number to the plaintext file data in a predictable location before encrypting; when decrypting, if the magic number is wrong, you used the wrong key. Downside to this approach is that it cannot validate the integrity/authenticity of the entire message.
To do that, use AES in an authenticated mode (e.g. AES-GCM) which gives stronger guarantees that the rest of the message was not tampered with.
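With PyCryptodome, for instance (a minimal sketch; the key size and message are arbitrary), AES-GCM turns a wrong key into a clean verification error instead of garbage output:

# Sketch: AES-GCM detects a wrong key (or tampering) via the tag check.
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

key = get_random_bytes(32)
cipher = AES.new(key, AES.MODE_GCM)
ciphertext, tag = cipher.encrypt_and_digest(b"secret message")
nonce = cipher.nonce

# Decrypting with the wrong key raises ValueError instead of returning garbage
wrong_key = get_random_bytes(32)
try:
    AES.new(wrong_key, AES.MODE_GCM, nonce=nonce).decrypt_and_verify(ciphertext, tag)
except ValueError:
    print("wrong key or tampered ciphertext")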
One common way to verify that a key has been entered correctly, without revealing the actual key, is a KCV (Key Check Value). When you create the key, you calculate the KCV at the same time; when the key is later entered manually, you verify the entry by recalculating the KCV. This is used, e.g., when entering keys manually into HSMs from physical key letters.
To calculate a KCV for an AES key, you encrypt an all-zero (0x00) block with the key; the first 3 bytes of the resulting encrypted block are the KCV.
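A minimal sketch of that calculation with PyCryptodome (the key value here is just an example):

# Sketch: AES KCV = first 3 bytes of the encryption of one all-zero block.
from Crypto.Cipher import AES

def aes_kcv(key):
    block = AES.new(key, AES.MODE_ECB).encrypt(b"\x00" * 16)
    return block[:3].hex().upper()

key = bytes.fromhex("00112233445566778899AABBCCDDEEFF")
print(aes_kcv(key))   # compare against the KCV printed on the key letter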
There is a requirement in my new job regarding encryption. We need to encrypt specific data from the DB and then decrypt it whenever needed. We're planning to use DBMS_CRYPTO with the AES algorithm. The requirement is that whatever the input string is, and whatever its length, the encrypted string's length should be the same every time I encrypt. For example, a 15-character string should produce an encrypted string of the same length as a 24-character string. Can someone please guide me on this?
This is part of my homework question (but it is not the actual question).
In my question the professor asked me to generate a unique 56-bit key to encrypt and decrypt a message. Both the sender and receiver share only a pass-phrase (password). There is no key exchange here.
How do I get a unique 56-bit key with the help of a pass-phrase?
Can I use a hash? But how do I get a 56-bit hash value?
Question (relevant part):
Messenger app:
1) Alice and Bob share the same password (or passphrase); they must use the password to set up the tool to correctly encrypt and decrypt messages shared between each other.
2) Each message during Internet transmission must be encrypted using a 56-bit key.
3) DO NOT directly use the password as the key; you have to generate the same key between Alice and Bob to encrypt messages.
.......
.....
The keyword is key derivation function (KDF).
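As a minimal sketch of that (using only the Python standard library; the salt string and iteration count are illustrative assumptions that Alice and Bob simply agree on in advance), PBKDF2 stretches the shared passphrase into exactly 56 bits on both sides:

# Sketch: both parties derive the same 56-bit key from the shared
# passphrase. The salt and iteration count are made-up values the two
# sides must agree on; the salt need not be secret, since the
# passphrase is the only shared secret.
import hashlib

passphrase = b"correct horse battery staple"   # the shared password
salt = b"messenger-app-v1"                     # agreed, public salt
key_56 = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 100000, dklen=7)
assert len(key_56) * 8 == 56                   # 7 bytes = 56 bits (e.g. for DES)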
I'm looking for a method that enables a user to generate a public/private key pair using an initial key provided to him/her. I don't know if this is called hierarchical key generation, multilevel key generation, or something else. It's not important for the higher-level key to be able to decrypt the data of the lower level; I just need the pair to be generated using another key.
I have seen some articles, but they're all purely theoretical. Is there a way to achieve this for RSA?
It is pretty easy actually.
The algorithm for generating an RSA key pair boils down to finding a set of big prime numbers that fulfil some algebraic properties and are of the appropriate size.
If you need a 2048-bit RSA key, you will typically look for two prime numbers, each roughly 1024 bits long.
The process of finding a prime number is trial-and-error: you randomly pick an integer of appropriate size, and test if it is prime. If it is not, you retry.
In the real world, the random generator that drives the algorithm is a deterministic PRNG which is seeded with a secret of appropriate entropy (e.g. 128 bits of true randomness).
In your case, the PRNG seed can be derived from a user secret or even from another key (provided it is secret of course). Derivation should be performed with a salted KDF like HKDF, PBKDF2, etc.
You don't specify which crypto library you use: whatever it is, you must be clear on how it draws randomness and how to define the seed of the PRNG.
Example (in Python 2.x):
from Crypto.PublicKey import RSA
from Crypto.Hash import HMAC
from struct import pack
# The first key could also be read from a file
first_key = RSA.generate(2048)
# Here we encode the first key into bytes and in a platform-independent format.
# The actual format is not important (PKCS#1 in this case), but it must
# include the private key.
encoded_first_key = first_key.exportKey('DER')
seed_128 = HMAC.new(encoded_first_key + b"Application: 2nd key derivation").digest()
class PRNG(object):

    def __init__(self, seed):
        self.index = 0
        self.seed = seed
        self.buffer = b""

    def __call__(self, n):
        # Stretch the seed deterministically until n bytes are buffered
        while len(self.buffer) < n:
            self.buffer += HMAC.new(self.seed + pack("<I", self.index)).digest()
            self.index += 1
        # Hand out the first n bytes; keep the remainder for the next call
        result, self.buffer = self.buffer[:n], self.buffer[n:]
        return result
second_key = RSA.generate(2048, randfunc=PRNG(seed_128))
The drawbacks to keep in mind are that:
- the derived key is compromised as soon as the first key is compromised;
- the derived key cannot be stronger than the first key (the algorithm does not magically generate entropy: if the secret key or passphrase is short, you end up with a weak derived key).
My problem is a bit hairy, and I may be asking the wrong questions, so please bear with me...
I have a legacy MySQL database which stores the user passwords and salts for a membership system. Both of these values were hashed by the Ruby framework, roughly like this:
hashedsalt = Digest::SHA1.hexdigest("--#{Time.now.to_s}--#{login}--")
hashedpassword = Digest::SHA1.hexdigest("#{hashedsalt}:#{password}")
So both values are stored as 40-character strings (varchar(40)) in MySQL.
Now I need to import all of these users into the ASP.NET membership framework for a new web site, which uses a SQL Server database. It is my understanding that, the way I have ASP.NET membership configured, the user passwords and salts are also stored in the membership database (in table aspnet_Membership) as SHA1 hashes, which are then Base64-encoded and stored as nvarchar(128) data.
But from the length of the Base64-encoded strings that are stored (28 characters), it seems that the SHA1 hashes ASP.NET membership generates are only 20 characters long rather than 40. From some other reading I have been doing, I think this has to do with the number of bits per character, the character set, or the encoding.
So is there some way to convert the 40-character SHA1 hashes to 20-character hashes which I can then transfer to the new ASP.NET membership data table? I'm pretty familiar with ASP.NET membership by now, but I feel like I'm missing this one piece. Then again, it may turn out that SHA1 in Ruby and SHA1 in .NET are incompatible, and I'm fighting a losing battle...
Thanks in advance for any insight.
The varchar representation in your Ruby app appears to be 'hex as string', something like 01AB02EF...23EF: each byte is represented as a pair of characters giving the hex value of that byte, from 00 to FF. The SHA-1 hash (20 bytes) is therefore represented as 40 characters; if the hash bytes were (0, 1, 2, ...), the string would start 000102. The ASP.NET Base64 value is the Base64 encoding of the actual bytes. So all you need to do is take the MySQL characters, obtain the corresponding bytes, and encode them as Base64.
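In Python that conversion is a couple of lines (a sketch; the hex value is the same example used in the SQL below):

# Sketch: decode the 40-char hex string from MySQL into its 20 raw
# bytes, then Base64-encode them the way ASP.NET membership stores them.
import base64, binascii

hex_hash = "000102030405060708090A0B0C0D0E0F10111213"   # 40 hex chars
raw = binascii.unhexlify(hex_hash)                       # 20 bytes
print(base64.b64encode(raw).decode("ascii"))             # 28-char Base64 string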
You can actually do the transformation in SQL itself:
-- Build the 20 raw bytes from the 40-character hex string, then emit
-- them as Base64 (FOR XML PATH serializes varbinary as Base64)
declare @x varchar(40);
set @x = '000102030405060708090A0B0C0D0E0F10111213';
declare @sql nvarchar(max);
set @sql = N'set @out=0x' + @x;
declare @out varbinary(20);
exec sp_executesql @sql, N'@out varbinary(20) output', @out output;
select @out for xml path('');
But coercing your ASP.NET membership provider to use the salted hash your Ruby code created, regardless of the encoding used to store the hash digest, is a whole different topic. You'll likely have to write your own membership provider, at which point the storage encoding becomes irrelevant, since you can store the hashes however you wish.
Note that the Ruby code shown uses Digest::SHA1, and ASP.NET membership also uses SHA-1 by default, so the two sides are not running incompatible algorithms; the difference is only the encoding (hex vs. Base64). If the algorithms really did differ, you could not 'convert' between them: you would have to recalculate the hashes from the plaintext.
Edit: SHA is standardized, so identical input produces identical output in any conforming Ruby or .NET implementation.
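As a quick sanity check of that claim (a sketch; the salt and password values are made up), the same computation as the Ruby hashedpassword line, written in Python, yields the identical 40-character hex digest:

# Sketch: reproduce Digest::SHA1.hexdigest("#{hashedsalt}:#{password}")
# in Python; the salt and password here are made-up example values.
import hashlib

hashedsalt = "0123456789abcdef0123456789abcdef01234567"   # example 40-char salt
password = "hunter2"                                      # example password
hashedpassword = hashlib.sha1(("%s:%s" % (hashedsalt, password)).encode()).hexdigest()
print(hashedpassword)   # the same 40-char hex string Ruby produces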