Migrating to KeyBlock LMK

I am trying to migrate a key from a variant LMK to a keyblock LMK. I am using a utility that issues the BW command for this purpose. The command is failing on the HSM side with error code A1 (Incompatible LMK schemes).
Request:
0001BWFF1U********KEYTOBEMIGRATED*********;000%01#52N00S00
Response :
0001BXA1
I assume both LMKs are loaded properly in key change storage, so any advice on what the issue could be here?
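As background for anyone reproducing this from code: payShield host commands sent over TCP are conventionally framed with a 2-byte big-endian length prefix ahead of the ASCII message (header, command code, fields). The framing helper below is a minimal sketch under that assumption, not part of the original question, and the key fields are omitted:

```java
import java.nio.charset.StandardCharsets;

public class HostMessage {
    // Prepend the 2-byte big-endian length prefix conventionally used
    // when sending payShield host commands over TCP (an assumption of
    // this sketch; check your HSM's configured message framing).
    static byte[] frame(String message) {
        byte[] body = message.getBytes(StandardCharsets.US_ASCII);
        byte[] framed = new byte[body.length + 2];
        framed[0] = (byte) (body.length >> 8);
        framed[1] = (byte) (body.length & 0xFF);
        System.arraycopy(body, 0, framed, 2, body.length);
        return framed;
    }

    public static void main(String[] args) {
        // "0001" message header + "BW" command code; key fields omitted here.
        byte[] msg = frame("0001BW");
        System.out.printf("%02X%02X %s%n", msg[0], msg[1],
                new String(msg, 2, msg.length - 2, StandardCharsets.US_ASCII));
    }
}
```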

Can we use standalone Spring Cloud Schema Registry with Confluent's KafkaAvroSerializer?

I have a project using Spring Cloud Stream with the Kafka Streams binder. For the output of a stream, I am using Avro, with the Serde provided by Confluent (io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde).
I am able to use it with the Confluent Schema Registry; serialization and deserialization take place correctly.
However, I wanted to see if we can use the Spring Cloud Schema Registry Server instead of the Confluent one. I configured a standalone Schema Registry server and set the schema registry in my project to it (changed the schemaRegistryClient.endpoint and schema.registry.url properties).
When I tried it out, Spring Cloud seems able to work with the standalone server: it registers the schema available in the resources folder as a .avsc file. However, when I send a message, the Confluent serializer continues to treat the server as a Confluent Schema Registry (which has different REST endpoints from the Spring Schema Registry). As a result, it gets a 405 response code.
We get the following exception (partial stack trace):
org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: <my-avro-schema>
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unexpected character ('<' (code 60)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
at [Source: (sun.net.www.protocol.http.HttpURLConnection$HttpInputStream); line: 1, column: 2]; error code: 50005
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:230)
It seems to me that there are two possibilities:
Spring Schema Registry Server can work only with the content-type provided by Spring (specified as content-type: application/*+avro) and not with the native Serde provided by Confluent, or
There is an issue with the project configuration.
Can someone help me figure out which one is it? If it is the second one, can someone point out what is wrong?
Each schema registry provider requires its own SerDe library. For example, if you would like to integrate the AWS Glue Schema Registry with Kafka, you would need Amazon's SerDe library. Likewise, Confluent's SerDe library expects a Confluent Schema Registry at the address specified in the schema.registry.url property.
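To illustrate the two separate settings mentioned in the question (hostnames and the exact binder property prefix below are placeholders and may need adjusting for your setup): the Spring client and Confluent's Serde read different properties, so repointing both at the Spring server only redirects the Spring client, while Confluent's Serde still speaks the Confluent REST protocol to whatever schema.registry.url names.

```properties
# Read only by Spring Cloud's own schema registry client
spring.cloud.schemaRegistryClient.endpoint=http://spring-schema-server:8990

# Read by Confluent's SpecificAvroSerde; must point at a
# Confluent-compatible registry (its REST API differs from Spring's)
spring.cloud.stream.kafka.streams.binder.configuration.schema.registry.url=http://confluent-registry:8081
```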

How do I specify encryption type when using s3remote for DVC

I have just started to explore DVC, and I am trying S3 as my DVC remote. When I run the dvc push command, I get the generic error:
An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
I know for a fact that I get that error when I don't specify the encryption.
It is similar to running aws s3 cp with the --sse flag, or specifying ServerSideEncryption when using the boto3 library. How can I specify the encryption type when using DVC? Since DVC uses boto3 underneath, there must be an easy way to do this.
Got the answer for this immediately in the DVC Discord channel! By default, no encryption is used; we should specify which server-side encryption algorithm to use.
Running dvc remote modify worked for me:
dvc remote modify my-s3-remote sse AES256
There are a bunch of things that can be configured here. All this does is add an entry sse = AES256 under the ['remote "my-s3-remote"'] section of the .dvc/config file.
More on this here
https://dvc.org/doc/command-reference/remote/modify
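For reference, after running the command above, the .dvc/config file would look something like this (the remote name comes from the command in the answer; the bucket URL is a made-up placeholder):

```ini
['remote "my-s3-remote"']
    url = s3://my-bucket/dvc-store
    sse = AES256
```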

Artifactory: no matched algorithm and key

I have an error when I start Artifactory:
[art-exec-3] [ERROR] (o.j.s.c.EncryptionWrapperBase:185) - no matched algorithm and key for AES128 16uJT
[art-exec-3] [ERROR] (o.a.r.s.RepositoryServiceImpl:2626) - Failed to initialize remote repository '(name of maven repo)'. Repository will be blacked-out!
The context is:
I had a functional standalone Artifactory.
I deployed an HA Artifactory on top of it, with the same database and filestore:
I destroyed my standalone instance and created two instances for high availability.
And then I get the error above.
The Artifactory service is up; I can call the API, but only with GET requests, no changes like POST are possible.
And the web interface is not reachable.
If I test a POST request:
{
  "errors" : [ {
    "status" : 500,
    "message" : "Could not decrypt with artifactory key, due to: org.jfrog.security.crypto.KeyIdAlgCipherNotFound: no matched algorithm and key forAES128 16uJT"
  } ]
}
It is obviously an AES128 artifactory_key problem.
But I know my artifactory_key (and my master_key) are good.
They are still the same as before HA; there is no reason they would be different, but maybe...?
So, what's wrong?
Can the artifactory_key change over time without my consent?
How can I recover?
In my database, I have a table named "configs", and I think the artifactory_key is used to encrypt the config.
If I restore this table, maybe I can override the configuration?
Except it is all encrypted; with the artifactory_key, or the master_key?
It is not usable as is.
Thank you
Is the artifactory.key in place and located at $ARTIFACTORY_HOME/etc/security/ on both nodes?
Based on the error message, it seems it fails to decrypt a string that contains "16uJT".
Can you search for this string in $ARTIFACTORY_HOME/etc/artifactory.config.latest.xml?
If it is there, check what the expected decrypted value of that entry is and replace it manually. If you are not sure, you may try leaving it blank for now and see if that helps.
To import the new config after the change, save it as "artifactory.config.import.xml" and then restart the server.
If this doesn't help, kindly attach the full errors with any stack traces.
Thanks for your reply.
The problem is: I destroyed the old instance and recreated two new instances with HA licenses.
So I have an artifactory_key in $ARTIFACTORY_HOME/etc/security/, and it is the same on both instances, but apparently not the right one.
And the file $ARTIFACTORY_HOME/etc/artifactory.config.latest.xml is not the right one either, since the instances are new:
it is a default, empty config.
There is no "16uJT" string in it.
That's why I ask whether I can retrieve the right configuration directly from the database (the "configs" table),
except that it is encrypted, and I don't know how to decrypt it without an API request.
Regards,

BouncyCastle updated pgp key now getting checksum mismatch error

I have a utility that uses the BouncyCastle.Crypto DLL (version 1.7.4, runtime version 1.1.4) to decrypt a file that is given to it by another system.
I just updated the PGP key (and provided the encryptor with the new public key). The new key uses 4096-bit RSA encryption and has a 24-character password, which are the only differences I can think of between the new key and the old one. The old key used, I believe, 2048-bit encryption with a 7-character password.
When I attempt to decrypt a file, the process now fails when calling the PgpSecretKey.ExtractPrivateKey(char[] passPhrase) function provided by BouncyCastle. The error is "Checksum mismatch at 0 of 20."
The weird part is that the first time I tested it worked fine, then with no changes it began failing. I have tried with multiple encrypted files.
Since it's such an old version of BouncyCastle and this particular permutation of the ExtractPrivateKey function is no longer in use I am finding it difficult to locate relevant information. Any thoughts are appreciated.
I got that error once, "Checksum mismatch at 0 of 20." My issue was due to a wrong passphrase. Hope this helps someone.

Thales HSM Generate key "Form key from clear components" ("FK" command)

I have two clear components, generated from Java code with the command 000A30303030413230303255 (decoded from hex: 000A0000A2002U; this is the "GC", Translate a ZPK from LMK to ZMK Encryption, command from the 1270A513 Issue 3 manual).
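As a side note for readers, the raw command above can be decoded to see its framing: the first two bytes are a big-endian length (000A = 10), and the remaining ten bytes are the ASCII message. A quick sketch:

```java
public class DecodeThalesMessage {
    // Turn a hex string into bytes (simple helper; no input validation).
    static byte[] fromHex(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] raw = fromHex("000A30303030413230303255");
        // 2-byte big-endian length prefix, then the ASCII message body.
        int length = ((raw[0] & 0xFF) << 8) | (raw[1] & 0xFF);
        String message = new String(raw, 2, length,
                java.nio.charset.StandardCharsets.US_ASCII);
        System.out.println(length + " \"" + message + "\""); // 10 "0000A2002U"
    }
}
```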
Now I need to generate an encrypted key from those components. The console command for this is the "FK" command (1270A513 Issue 3, page 5-14).
I couldn't find any host command for doing this from Java code. In the Host Command Reference manual (1270A351 Issue 6) I found only A4, Form a Key from Encrypted Components, but that command works on encrypted components.
Is there a way to generate an encrypted key from clear components?
There is no way to do this, and for good reason. If you were to send the clear components from your Java code, they would travel over the network unencrypted and be open to attack; anyone intercepting them could reconstruct the key themselves. The GC and FK commands are meant to be used at the console, not remotely, which is why this is only possible there.
If you already have the components, you can only form them at the HSM console. If you can generate new keys instead, use the A0 command from your Java code.
I don't recommend using this in production, but if I really needed to do it, I would take the following steps:
1. Generate a ZMK (clear and encrypted) on the HSM console using the 'GC' and 'FK' commands (this only needs to be done once; reuse the key).
2. Use the clear ZMK to encrypt each of your components with TripleDES-ECB-NoPadding in your application.
3. Use command 'A6' to import all the ZMK-encrypted keys to the LMK.
4. Use the 'A4' command to form the key from the LMK-encrypted components.
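Step 2 above (encrypting a clear component under the clear ZMK with TripleDES-ECB-NoPadding) can be sketched with the JDK's javax.crypto. The ZMK and component values here are made-up placeholders, not real key material, and extending a 16-byte 2-key 3DES ZMK to 24 bytes as K1-K2-K1 is an assumption of this sketch:

```java
import javax.crypto.Cipher;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.DESedeKeySpec;

public class ZmkEncrypt {
    // Encrypt or decrypt a 16-byte value under a double-length (16-byte)
    // clear ZMK using TripleDES-ECB-NoPadding. The 16-byte 2-key 3DES key
    // is extended to 24 bytes (K1 K2 K1), as DESedeKeySpec requires.
    static byte[] crypt(int mode, byte[] zmk16, byte[] data) throws Exception {
        byte[] k24 = new byte[24];
        System.arraycopy(zmk16, 0, k24, 0, 16);
        System.arraycopy(zmk16, 0, k24, 16, 8);
        Cipher cipher = Cipher.getInstance("DESede/ECB/NoPadding");
        cipher.init(mode, SecretKeyFactory.getInstance("DESede")
                .generateSecret(new DESedeKeySpec(k24)));
        return cipher.doFinal(data);
    }

    public static void main(String[] args) throws Exception {
        byte[] zmk = new byte[16];       // dummy clear ZMK, for illustration only
        for (int i = 0; i < 16; i++) zmk[i] = (byte) (i + 1);
        byte[] component = new byte[16]; // dummy 16-byte clear component
        byte[] enc = crypt(Cipher.ENCRYPT_MODE, zmk, component);
        byte[] dec = crypt(Cipher.DECRYPT_MODE, zmk, enc);
        System.out.println(java.util.Arrays.equals(dec, component)); // true
    }
}
```

Note that NoPadding requires the input to be a multiple of 8 bytes, which a 16-byte component already is.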