Error "The parameter KeyVault Certificate has an invalid value" with App Service Certificate - azure-resource-manager

I have created a secret in my Azure Key Vault containing an SSL certificate converted from a .pfx file to a base64 string. Now I am trying to use it to create a certificate linked to an App Service using a Bicep file.
resource kv 'Microsoft.KeyVault/vaults@2021-06-01-preview' = {
  name: 'mykeyvault'
  location: resourceGroup().location
  properties: {
    tenantId: tenantId
    sku: {
      name: 'standard'
      family: 'A'
    }
    enabledForTemplateDeployment: true
    accessPolicies: [...]
  }
}

resource sslCertificateSecret 'Microsoft.KeyVault/vaults/secrets@2021-06-01-preview' = {
  name: '${kv.name}/sslcert'
  properties: {
    attributes: {
      enabled: true
    }
    value: <base64_string_ssl>
    contentType: 'application/x-pkcs12'
  }
}

resource appServicePlan 'Microsoft.Web/serverfarms@2021-01-15' = {
  name: 'myServiceplan'
  location: resourceGroup().location
  kind: 'linux'
  properties: {
    reserved: true
  }
  sku: {
    name: 'B1'
  }
}

resource sslCertificate 'Microsoft.Web/certificates@2021-01-15' = {
  name: 'myCertificate'
  location: resourceGroup().location
  properties: {
    keyVaultId: <my_keyvaultId>
    keyVaultSecretName: <my_keyvaultCertificateSecretName>
    serverFarmId: appServicePlan.id
  }
}
I also tried to import the certificate manually into the Key Vault and re-export it to ensure the base64 string was correct, and it seemed OK.
However, I am getting the error "The parameter KeyVault Certificate has an invalid value."
Do you have an idea of what I am missing?

Azure Key Vault is a solution for secure storage of confidential information.
There are two ways to authenticate a web application to Key Vault. The better approach is to authenticate the web application using a certificate, and that certificate can itself be deployed directly from Key Vault. This means neither the confidential information nor the keys to the vault are ever disclosed.
Please check the below steps:
Follow the documented steps for creating a certificate linked with an App Service from Key Vault: Loading the access certificate for your application into KeyVault.
Check the file formats of the certificates, which are the major building block when importing certificates.
PEM and PFX are the certificate formats supported by Azure Key Vault.
• The .pem file format consists of one or more X.509 certificate files.
• The .pfx archive file format can store a server certificate (issued for your domain), a matching private key, and an optional intermediate CA in a single file.
The first step is to convert any certificates used by the App Service to (and label them as) application/x-pkcs12. It might be possible to resolve the issue by re-importing the certificate from a .pfx file with the --password parameter (az keyvault certificate import), and then importing it from the Key Vault to the web app.
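As an illustration only (the vault, certificate, and file names here are placeholders, not taken from the question), the re-import with the Azure CLI would look like:
az keyvault certificate import --vault-name mykeyvault --name sslcert --file ./PrivateCertificate.pfx --password "<pfx-password>"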
Also, check that the certificate and the Key Vault are still in their original resource group and in the same subscription.
References:
Azure Key Vault: Import Certificates (Microsoft documentation) and Deploying Azure Web App Certificate using Key Vault (GitHub source).
If you omitted the certificate policy on upload, and you are generating new certificates anyway, try generating the certificate in the Key Vault itself:
$credential = Get-Credential
Login-AzureRmAccount -Credential $credential
$vaultName = 'my-vault-full-of-keys'
$certificateName = 'my-new-cert'
# Self-signed policy; adjust the subject name and validity to your needs
$policy = New-AzureKeyVaultCertificatePolicy -SubjectName "CN=mememe.me" -IssuerName Self -ValidityInMonths 120
Add-AzureKeyVaultCertificate -VaultName $vaultName -Name $certificateName -CertificatePolicy $policy
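If you are on the current Az PowerShell module rather than the legacy AzureRM cmdlets above, the equivalent sketch (same placeholder names) would be:
Connect-AzAccount
$policy = New-AzKeyVaultCertificatePolicy -SubjectName "CN=mememe.me" -IssuerName Self -ValidityInMonths 120
Add-AzKeyVaultCertificate -VaultName 'my-vault-full-of-keys' -Name 'my-new-cert' -CertificatePolicy $policy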
"The parameter KeyVault Certificate has an invalid value"
Please check that you have granted the resource provider permission to access the Key Vault.
Use PowerShell to allow the 'Microsoft.Web' resource provider to access the Azure Key Vault directly:
Login-AzureRmAccount
Set-AzureRmContext -SubscriptionId AZURE_SUBSCRIPTION_ID
# abfa0a7c-a6b6-4736-8310-5855508787cd is the well-known service principal of the Microsoft.Web (App Service) resource provider
Set-AzureRmKeyVaultAccessPolicy -VaultName KEY_VAULT_NAME -ServicePrincipalName abfa0a7c-a6b6-4736-8310-5855508787cd -PermissionsToSecrets get
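With the current Az module, the same grant would be (a sketch, same placeholder vault name):
Set-AzKeyVaultAccessPolicy -VaultName KEY_VAULT_NAME -ServicePrincipalName abfa0a7c-a6b6-4736-8310-5855508787cd -PermissionsToSecrets get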
Sometimes the problem lies in how the certificate was uploaded to the Key Vault: if you are using PowerShell, give the full path to the certificate instead of a relative path when uploading.
$pfxFilePath = "PFX_CERTIFICATE_FILE_PATH" # Change this path
Example:
$pfxFilePath = "F:\KeyVault\PrivateCertificate.pfx"
$pwd = "[2+)t^BgfYZ2C0WAu__gw["
$flag = [System.Security.Cryptography.X509Certificates.X509KeyStorageFlags]::Exportable
$collection = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2Collection
$collection.Import($pfxFilePath, $pwd, $flag)
$pkcs12ContentType = [System.Security.Cryptography.X509Certificates.X509ContentType]::Pkcs12
$clearBytes = $collection.Export($pkcs12ContentType)
$fileContentEncoded = [System.Convert]::ToBase64String($clearBytes)
$secret = ConvertTo-SecureString -String $fileContentEncoded -AsPlainText -Force
$secretContentType = 'application/x-pkcs12'
Set-AzureKeyVaultSecret -VaultName akurmitestvault -Name keyVaultCert -SecretValue $secret -ContentType $secretContentType # Change the Key Vault name and secret name

Related

403 Error when using generated Sas token to display blobs from Azure blob storage

I've been trying to display images from Azure blob storage on my web app for a while now.
My storage account SAS token is:
?sv=2021-06-08&ss=bfqt&srt=sco&sp=rwdlacupiytfx&se=2022-12-09T08:03:09Z&st=2022-11-09T08:03:09Z&spr=https&sig=SIGNATURE_HERE
This SAS token includes all permissions and allows all resource types and services.
To generate a SAS token to view a blob, I go through the following steps:
1. Getting the blobService:
const blobService = new BlobServiceClient(`https://${storageAccountName}.blob.core.windows.net/?${storageAccountSasToken}`);
2. Creating a containerClient:
const containerClient = blobService.getContainerClient(containerName);
3. Creating a sasOptions object:
const sasOptions = {containerName: containerName, blobName: blobName, startsOn: sasStartTime, expiresOn: sasExpiryTime, permissions: "racwdt" as unknown as BlobSASPermissions};
4. Generating SAS token with the parameters:
generateBlobSASQueryParameters(sasOptions, sharedKeyCredential).toString();
5. Sending the blobURL (with the SAS token attached) back to the user:
const blobURL = containerClient.getBlockBlobClient(blobName).url;
The problem is, when using the blobURL as src for my Image tag, I get a 403 (forbidden) error:
Server failed to authenticate the request. Make sure the value of
Authorization header is formed correctly including the signature.
the faulty blobURL in question:
https://mywebsite.blob.core.windows.net/container/profilePictures%2Fpicture.png?sv=2021-06-08&ss=bfqt&srt=sco&sp=rwdlacupiytfx&se=2022-12-09T08:03:09Z&st=2022-11-09T08:03:09Z&spr=https&sig=CITlY0uPxBCGdBeMtIxxJafJM61HQlhooR5ZnDiPHuE%3D
The Error:
AuthenticationFailed
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:df81f724-f01e-000e-593e-f41f7f000000 Time:2022-11-09T13:24:08.3305270Z
Signature did not match. String to sign used was STORAGE_ACCOUNT_NAME racwdt bfqt sc 2022-11-09T12:31:47Z 2022-12-09T20:31:47Z https 2021-06-08
Additional information:
The sasToken env variable includes "?" at the start of the string
All containers are PRIVATE.
My storage account is only accessible through a specific virtual network
My website's domain is listed on "Allowed Origins" in CORS tab, as well as localhost:3000
Uploading to Blob storage works, so it's safe to assume that the problem is solely related to the generated SAS token.
Any assistance would be gladly appreciated :)
I tried this in my environment and got the results below:
Code:
var storage = require("#azure/storage-blob")
const accountname ="storage13261";
const key = "< Account key >";
const cred = new storage.StorageSharedKeyCredential(accountname,key);
const blobServiceClient = new storage.BlobServiceClient(`https://${accountname}.blob.core.windows.net`,cred);
const containerName="test";
const client =blobServiceClient.getContainerClient(containerName)
const blobName="nature.png";
const blobClient = client.getBlobClient(blobName);
const blobSAS = storage.generateBlobSASQueryParameters({
containerName,
blobName,
permissions: storage.BlobSASPermissions.parse("racwdt"),
startsOn: new Date(),
expiresOn: new Date(new Date().valueOf() + 86400)
},
cred
).toString();
const sasUrl= blobClient.url+"?"+blobSAS;
console.log(sasUrl);
Console:
The console prints the SAS URL for the blob. The problem is with the SAS token attached to your URL: the string-to-sign in the error uses the blob SAS permissions racwdt, but your faulty URL still carries the account SAS parameters (sp=rwdlacupiytfx) from the service client's connection URL rather than the blob SAS you generated, so the signatures do not match and the image fails to display.
I checked the URL + SAS token in the browser and it worked perfectly.
Reference:
Grant limited access to data with shared access signatures (SAS) - Azure Storage | Microsoft Learn
Updated:
You can also generate both the SAS token and the SAS URL manually from the Azure portal, checking the required permissions there.

FTP_INCORRECT_HOST_KEY in N/SFTP Module

While creating a connection from NetSuite to SFTP using the N/sftp module, I'm facing an error that states:
"FTP_INCORRECT_HOST_KEY","message":"Provided host key does not match
remote server's fingerprint."
I have tried checking with my server team, but no luck. Can anyone suggest how to resolve this, or how I can get an authorized fingerprint/host key from the server?
I have tried the SuiteScript 2.0 module (N/sftp) with the help of the tool mentioned below.
https://ursuscode.com/netsuite-tips/suitescript-2-0-sftp-tool/
/**
 * @NApiVersion 2.x
 * @NScriptType ScheduledScript
 */
define(['N/sftp', 'N/file', 'N/runtime'], function (sftp, file, runtime) {
    function execute(context) {
        var myPwdGuid = "Encrypted password by GUID";
        var myHostKey = "Some long host key around 380 characters";
        // establish connection to remote FTP server
        var connection = sftp.createConnection({
            username: 'fuel_integration',
            passwordGuid: myPwdGuid, // references var myPwdGuid
            url: '59.165.215.45', // example IP
            directory: '/sftproot/TaleoSync',
            restrictToScriptIds: runtime.getCurrentScript().id,
            restrictToCurrentUser: false,
            hostKey: myHostKey // references var myHostKey
        });
        // download the file from the remote server
        var downloadedFile = connection.download({
            directory: '/sftproot/TaleoSync',
            filename: 'Fuel Funnel Report_without filter.csv'
        });
        downloadedFile.folder = FOLDER_INTERNAL_ID; // placeholder: the File Cabinet folder ID was omitted in the original post
        downloadedFile.save();
        context.response.write(' Downloaded "Fuel Funnel Report_without filter" to fileCabinet');
    }
    return {
        execute: execute
    };
});
I expect to create a connection between the SFTP server and NetSuite, download a file from SFTP, and place it in the NetSuite File Cabinet.
A couple of things:
restrictToScriptIds: runtime.getCurrentScript().id,
restrictToCurrentUser: false,
are not part of the createConnection signature. Those options should have been used when you created the Suitelet that vaulted your credential.
However, the host key complaint may be dealt with by using ssh-keyscan from a Linux box:
ssh-keyscan 59.165.215.45
should reply with the server name, then ssh-rsa, then a long base64 string. Copy that string so it ends up in myHostKey, and set hostKeyType to RSA.
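For illustration, a sketch of the corrected call reusing the variable names from the question (the base64 string from ssh-keyscan goes into myHostKey; the hostKeyType value follows the RSA suggestion above):
var connection = sftp.createConnection({
    username: 'fuel_integration',
    passwordGuid: myPwdGuid,
    url: '59.165.215.45',
    directory: '/sftproot/TaleoSync',
    hostKey: myHostKey, // the base64 string reported by ssh-keyscan
    hostKeyType: 'RSA' // matches the ssh-rsa entry
});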

IdentityServer3 - Client certificate validation

I have IdentityServer3 and I'm trying to run their original samples WebHost (minimal) as the server and Console Client Credentials Flow using Certificate as the client, because I want to test that the client can authenticate against IdS3 using an X.509 thumbprint instead of a shared secret to get an access token.
The problem I'm having is that I'm getting an error response: invalid_client.
Apparently, it's because IdS3 doesn't receive the certificate on the incoming request, so it considers the token request invalid (I tested this by adding a custom SecretParser and checking the environment parameter: there is no ssl.ClientCertificate value, which is the one X509CertificateSecretParser uses to parse it).
I'm just running both projects in 2 different instances of Visual Studio into IIS Express without modifying anything else on the projects. Is there anything that I'm missing on this matter? What else should I need to setup in order to make this work?
The first thing you need to do is to enable client certificates in IIS Express.
You do this by editing this file:
.vs\config\applicationhost.config
Change
<access sslFlags="None" />
to
<access sslFlags="Ssl, SslNegotiateCert" />
Now IIS Express supports client certificates, but it checks if the certificate is trusted as well.
The sample certificate, Client.pfx, will not work out of the box.
You can either let Windows trust the issuer of this certificate (not recommended) or you could load an existing certificate from the certificate store with code like this:
X509Store store = new X509Store(StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);
string thumb = "<thumbprint>";
X509Certificate2Collection cers = store.Certificates.Find(X509FindType.FindByThumbprint, thumb, false);
X509Certificate2 cert = null;
if (cers.Count > 0)
{
    cert = cers[0];
}
store.Close();
You will also need to put the thumbprint of this certificate into the ClientSecret property in the client list on the Identity Server.
This is the sample code you will need to change:
new Client
{
    ClientName = "Client Credentials Flow Client",
    Enabled = true,
    ClientId = "clientcredentials.client",
    Flow = Flows.ClientCredentials,
    ClientSecrets = new List<Secret>
    {
        new Secret("secret".Sha256()),
        new Secret
        {
            Value = "<your thumbprint here>",
            Type = Constants.SecretTypes.X509CertificateThumbprint,
            Description = "Client Certificate"
        },
    },
    AllowedScopes = new List<string>
    {
        "read",
        "write"
    },
    Claims = new List<Claim>
    {
        new Claim("location", "datacenter")
    }
},
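To complete the picture, a minimal sketch of how the client side could attach the certificate to the token request (assuming .NET Framework's WebRequestHandler; the endpoint URL is illustrative, and cert is the certificate loaded from the store as shown above):
// Attach the client certificate so IdS3 receives it on the incoming request
var handler = new WebRequestHandler();
handler.ClientCertificates.Add(cert);
var client = new HttpClient(handler);
var response = client.PostAsync("https://localhost:44333/core/connect/token",
    new FormUrlEncodedContent(new Dictionary<string, string>
    {
        { "grant_type", "client_credentials" },
        { "client_id", "clientcredentials.client" },
        { "scope", "read" }
    })).Result;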

Connect to HTTPS web service from Azure

I have a web role in Azure that has to connect to an SSL-secured external web service. When the application tries to connect to the web service, it's giving an error:
Could not establish trust relationship for the SSL/TLS secure channel
with authority 'certname.organization.org'.
The certificate that it needs has been uploaded to Azure as a service certificate, but for some reason it doesn't seem to be properly referencing it or using it.
Any thoughts on how to fix this?
That sounds like your service client in Azure isn't happy with the SSL certificate of the external service you're calling - do you have control of that service?
You can test this by using the following to ignore SSL errors from your client in Azure:
// Diagnostic only: this accepts every certificate, so never ship it to production
ServicePointManager.ServerCertificateValidationCallback =
    (obj, certificate, chain, errors) => true;
I've seen this problem intermittently as well. In my case it turned out that the network connection to fetch one of the root certificates would sometimes time out. Then on future requests it would work again.
I ended up writing a custom callback that would let the particular certificate I was interested in work despite the errors, without affecting validation of other certificates. The below is my code for that. As you can probably tell, I'm trying to hit the Android Cloud-to-Device Messaging endpoint, and trying to work around problems with the wildcard cert that Google uses, but it should be generalizable. This also has all the logging I used to diagnose the particular error. Even if you don't want to force validation of the certificate, the logging code could help you decide how to proceed.
private static readonly Uri PUSH_URI = new Uri("https://android.apis.google.com/c2dm/send", UriKind.Absolute);

/**
//The following function needs to be wired up in code somewhere else, like this:
ServicePointManager.ServerCertificateValidationCallback += ValidateDodgyGoogleCertificate;
**/

/// <summary>
/// Validates the SSL server certificate. Note this is process-wide code.
/// Wrote a custom one because the certificate used for Google's push endpoint is not for the correct domain. Go Google.
/// </summary>
/// <param name="sender">either a host name string, or an object derived from WebRequest</param>
/// <param name="cert">The certificate used to authenticate the remote party.</param>
/// <param name="chain">The chain of certificate authorities associated with the remote certificate.</param>
/// <param name="sslPolicyErrors">One or more errors associated with the remote certificate.</param>
/// <returns>
/// Returns a boolean value that determines whether the specified
/// certificate is accepted for authentication; true to accept or false to
/// reject.
/// </returns>
private static bool ValidateDodgyGoogleCertificate(object sender, X509Certificate cert, X509Chain chain, SslPolicyErrors sslPolicyErrors)
{
    if (sslPolicyErrors == SslPolicyErrors.None)
    {
        // Good certificate.
        return true;
    }
    string hostName = sender as string;
    if (hostName == null)
    {
        WebRequest senderRequest = sender as WebRequest;
        if (senderRequest != null)
        {
            hostName = senderRequest.RequestUri.Host;
        }
    }
    // We want to get past the Google name mismatch, but not allow any other errors
    if (sslPolicyErrors != SslPolicyErrors.RemoteCertificateNameMismatch)
    {
        StringBuilder sb = new StringBuilder();
        sb.AppendFormat("Rejecting remote server SSL certificate from host \"{0}\" issued to Subject \"{1}\" due to errors: {2}", hostName, cert.Subject, sslPolicyErrors);
        // Bitwise AND: only log chain details when chain errors are actually present
        if ((sslPolicyErrors & SslPolicyErrors.RemoteCertificateChainErrors) != SslPolicyErrors.None)
        {
            sb.AppendLine();
            sb.AppendLine("Chain status errors:");
            foreach (var chainStatusItem in chain.ChainStatus)
            {
                sb.AppendFormat("Chain Item Status: {0} StatusInfo: {1}", chainStatusItem.Status, chainStatusItem.StatusInformation);
                sb.AppendLine();
            }
        }
        log.Info(sb.ToString());
        return false;
    }
    if (PUSH_URI.Host.Equals(hostName, StringComparison.InvariantCultureIgnoreCase))
    {
        return true;
    }
    log.Info(string.Format("Rejecting remote server SSL certificate from host \"{0}\" issued to Subject \"{1}\" due to errors: {2}", hostName, cert.Subject, sslPolicyErrors));
    return false;
}
Ignoring SSL errors is one thing you can do.
But if it works on your machine and doesn't work on your instances, it might also be that the certificate chain is incomplete on the instances. You'll need to open the certificate on your machine, go to Certification Path, and export each certificate in the path.
Then, add these certificates to your project and have a startup task (.bat or .cmd file) add them to the trusted root CA:
REM Install certificates.
certutil -addstore -enterprise -f -v root Startup\Certificates\someROOTca.cer
certutil -addstore -enterprise -f -v root Startup\Certificates\otherROOTca.cer
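For context, a sketch of how such a startup task is registered in the cloud service's ServiceDefinition.csdef (the .cmd file name here is an assumption); it must run elevated so certutil can write to the machine store:
<Startup>
  <Task commandLine="Startup\InstallCerts.cmd" executionContext="elevated" taskType="simple" />
</Startup>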
I added the .cer to the root of my project, selected "Copy Always", and used the following command to make Azure connect to a server with a self-signed SSL certificate:
REM Install certificates.
certutil -addstore -enterprise -f -v root startsodev.cer

Forbidden 403 error when attaching client certificate

I am consuming a service, and to consume it the service provider has given me a certificate.
I have installed the certificate on LocalMachine, and through the following code I attach the certificate to the web request that I post to get a response from the web service.
X509Certificate cert = null;
string ResponseXml = string.Empty;
// Represents an X.509 store, which is a physical store
// where certificates are persisted and managed
X509Store certStore = new X509Store(StoreName.My, StoreLocation.LocalMachine);
certStore.Open(OpenFlags.ReadOnly);
X509Certificate2Collection results =
    certStore.Certificates.Find(X509FindType.FindBySubjectDistinguishedName,
        Constants.CertificateName, false);
certStore.Close();
if (results != null && results.Count > 0)
    cert = results[0];
else
{
    ErrorMessage = "Certificate not found";
    return ErrorMessage;
}
webClient.TransportSettings.ClientCertificates.Add(cert);
This works perfectly when I run the code with ASP.NET Cassini (the ASP.NET Development Server).
But when I host this code in IIS 7.0, it gives a 403 Forbidden error as the response.
Please suggest.
You should maybe try this:
winhttpcertcfg -g -c LOCAL_MACHINE\MY -s (MyCertificate) -a ASPNET
As it turns out, the user who installs the certificate is automatically granted access to the private key. I guess in your case that would be you, so it works in the dev environment. When the web front end comes along, you are no longer the user; ASPNET is.
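Note that on IIS 7.0 the worker process usually runs as NETWORK SERVICE or an application pool identity rather than ASPNET, so you may need to grant that account instead (the account name here is an assumption about your app pool configuration):
winhttpcertcfg -g -c LOCAL_MACHINE\MY -s MyCertificate -a "NETWORK SERVICE"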
