X509Certificate2 - the system cannot find the path specified - asp.net

I want to retrieve Google Analytics data via a service account.
The first time I launch the application, everything works correctly and I have access to the data. But when I launch the application a second time, I get the following error: "the system cannot find the path specified". Do you have any idea? I thought it could be a lock.
This is my source code:
public static String GetAccessToken(string clientIdEMail, string keyFilePath, String scope)
{
    // certificate
    var certificate = new X509Certificate2(keyFilePath, "notasecret", X509KeyStorageFlags.MachineKeySet | X509KeyStorageFlags.Exportable);
    // header
    var header = new { typ = "JWT", alg = "RS256" };
    // claimset
    var times = GetExpiryAndIssueDate();
    var claimset = new
    {
        iss = clientIdEMail,
        scope = scope,
        aud = "https://accounts.google.com/o/oauth2/token",
        iat = times[0],
        exp = times[1],
    };
    JavaScriptSerializer ser = new JavaScriptSerializer();
    // encoded header
    var headerSerialized = ser.Serialize(header);
    var headerBytes = Encoding.UTF8.GetBytes(headerSerialized);
    var headerEncoded = Convert.ToBase64String(headerBytes);
    // encoded claimset
    var claimsetSerialized = ser.Serialize(claimset);
    var claimsetBytes = Encoding.UTF8.GetBytes(claimsetSerialized);
    var claimsetEncoded = Convert.ToBase64String(claimsetBytes);
    // input
    var input = headerEncoded + "." + claimsetEncoded;
    var inputBytes = Encoding.UTF8.GetBytes(input);
    // signature
    var rsa = certificate.PrivateKey as RSACryptoServiceProvider;
    var cspParam = new CspParameters
    {
        KeyContainerName = rsa.CspKeyContainerInfo.KeyContainerName,
        KeyNumber = rsa.CspKeyContainerInfo.KeyNumber == KeyNumber.Exchange ? 1 : 2,
        Flags = CspProviderFlags.UseMachineKeyStore
    };
    var aescsp = new RSACryptoServiceProvider(1024, cspParam) { PersistKeyInCsp = false };
    var signatureBytes = aescsp.SignData(inputBytes, "SHA256");
    var signatureEncoded = Convert.ToBase64String(signatureBytes);
    // jwt
    var jwt = headerEncoded + "." + claimsetEncoded + "." + signatureEncoded;
    var client = new WebClient();
    client.Encoding = Encoding.UTF8;
    var uri = "https://accounts.google.com/o/oauth2/token";
    var content = new NameValueCollection();
    content["assertion"] = jwt;
    content["grant_type"] = "urn:ietf:params:oauth:grant-type:jwt-bearer";
    string response = Encoding.UTF8.GetString(client.UploadValues(uri, "POST", content));
    JsonGoogleResponse result = (ser.Deserialize<JsonGoogleResponse>(response));
    return result.access_token;
}
And this is the stack trace:
at System.Security.Cryptography.CryptographicException.ThrowCryptogaphicException(Int32 hr)
at System.Security.Cryptography.SafeProvHandle._FreeCSP(IntPtr pProvCtx)
at System.Security.Cryptography.SafeProvHandle.ReleaseHandle()
at System.Runtime.InteropServices.SafeHandle.InternalFinalize()
at System.Runtime.InteropServices.SafeHandle.Dispose(Boolean disposing)
at System.Runtime.InteropServices.SafeHandle.Finalize()

If you are running in IIS, you need to set "Load User Profile" to True in the application pool's advanced settings to be able to load a cert by filename & password.

So, I just had the exact same problem and spent almost 4 hours trying to solve it.
The problem was the path passed for the key. I had used the code from the Google sample console application, where the path was simply "key.p12" and the key sat in the same directory as the exe file.
When I moved to an MVC application, I did not realize that a file at the root of the virtual server path cannot be referenced simply as "key.p12".
SOLUTION
Double-check the path to the key. If it is an MVC application (or another ASP.NET web app), add the key file to the project root and resolve its physical path in code with Server.MapPath("key.p12").
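As a rough sketch of that call (the client e-mail, scope, and file name below are placeholders, not values from the question):
// Resolve the physical path of the key file from the web application root,
// then hand it to the GetAccessToken method from the question.
string keyFilePath = Server.MapPath("~/key.p12");
string accessToken = GetAccessToken(
    "your-service-account@developer.gserviceaccount.com", // placeholder service account e-mail
    keyFilePath,
    "https://www.googleapis.com/auth/analytics.readonly"); // placeholder scope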

I just had the same issue; in my case it was a space in the path. I have no idea why, but when I put the p12 file in the C:\ root, it works...
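As a purely illustrative sanity check (not part of the original answer), you could verify the path resolves before constructing the certificate, which helps distinguish a bad path from a key-store problem:
// Fail fast with an explicit message if the key file path does not resolve.
if (!System.IO.File.Exists(keyFilePath))
    throw new System.IO.FileNotFoundException("Service account key file not found", keyFilePath);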

Related

Xamarin Forms: Get the path of an image file stored on the shared project?

I am trying to upload an image file as ByteArrayContent through my web service. I have added all the images to the shared project and set their build action to Embedded resource.
Following is my code:
var fileBytes = File.ReadAllBytes("Avatars." + selectedAvatar);
var byteContent = new ByteArrayContent(fileBytes);
content.Add(byteContent, "file", selectedAvatar);
When I try it like the above, I get System.IO.FileNotFoundException: Could not find file "/Projectname.Avatars.ic_avatar01_xx.png"
I added the images directly inside a folder in the shared project (screenshot omitted).
I tried changing the . to a / in the file path, like below:
var fileBytes = File.ReadAllBytes("Avatars/" + selectedAvatar);
var byteContent = new ByteArrayContent(fileBytes);
content.Add(byteContent, "file", selectedAvatar);
But in that case, I get System.IO.DirectoryNotFoundException: Could not find a part of the path "/Avatars/ic_avatar01_xx.png"
What is the correct way to get the path of an image file stored on a shared project?
Also tried another approach:
string avatarFileName = "Avatars/" + selectedAvatar;
var assembly = typeof(ProfilePage).GetTypeInfo().Assembly;
var stream = assembly.GetManifestResourceStream($"{assembly.GetName().Name}.{avatarFileName}");
content.Add(stream, "file", avatarFileName);
But in the above case I get yet another error (the error message is not shown in the question).
If you want to upload the image with a Stream, you could check the following code:
private async Task<string> UploadImage(Stream FileStream)
{
    HttpClient client = new HttpClient();
    client.BaseAddress = new Uri("http://your.url.com/");
    MultipartFormDataContent form = new MultipartFormDataContent();
    HttpContent content = new StringContent("fileToUpload");
    form.Add(content, "fileToUpload");
    content = new StreamContent(FileStream);
    content.Headers.ContentDisposition = new ContentDispositionHeaderValue("form-data")
    {
        Name = "fileToUpload",
        FileName = "xxx.png"
    };
    form.Add(content);
    var response = await client.PostAsync("http://your.url.com/", form);
    return await response.Content.ReadAsStringAsync();
}
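A rough usage sketch, assuming the avatar is an embedded resource in the shared project and that the resource name format below matches your project (adjust the namespace/folder as needed):
// Hypothetical usage: open the embedded avatar as a stream and pass it to UploadImage.
// Requires using System.Reflection; for GetTypeInfo().
var assembly = typeof(ProfilePage).GetTypeInfo().Assembly;
var resourceName = $"{assembly.GetName().Name}.Avatars.{selectedAvatar}";
using (var stream = assembly.GetManifestResourceStream(resourceName))
{
    string serverResponse = await UploadImage(stream);
}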
Option 2:
You could also use the FileUploaderPlugin plugin. It supports uploading multiple files at once.
Uploading from a file path
CrossFileUploader.Current.UploadFileAsync("<URL HERE>",
    new FilePathItem("<REQUEST FIELD NAME HERE>", "<FILE PATH HERE>"),
    new Dictionary<string, string>()
    {
        { "<HEADER KEY HERE>", "<HEADER VALUE HERE>" }
    }
);
Option 3:
The first parameter of MultipartFormDataContent.Add is an HttpContent. To handle the stream, try using the StreamContent type, which inherits from HttpContent: create a StreamContent from the stream and add it to the MultipartFormDataContent.
string avatarFileName = "Avatars." + selectedAvatar;
var assembly = typeof(ProfilePage).GetTypeInfo().Assembly;
var stream = assembly.GetManifestResourceStream($"{assembly.GetName().Name}.{avatarFileName}");
var streamContent = new StreamContent(stream);
content.Add(streamContent, "file", avatarFileName);

ITfoxtec SAML 2.0 encrypt assertion

Is it possible to encrypt the assertion response with ITfoxtec Identity Saml2 (open source - https://itfoxtec.com/identitysaml2)? I haven't been able to find anything.
The GitHub site (https://github.com/ITfoxtec/ITfoxtec.Identity.Saml2) mentions decrypting but not encrypting, and there don't seem to be any examples of encrypting either.
Any help is appreciated. Thanks.
In the Saml2PostBinding class, replace the BindInternal method with the code below.
protected override Saml2PostBinding BindInternal(Saml2Request saml2RequestResponse, string messageName)
{
    BindInternal(saml2RequestResponse);
    var element1 = XmlDocument.CreateElement("saml2", "EncryptedAssertion", "urn:oasis:names:tc:SAML:2.0:assertion");
    XmlDocument xmlDoc = new XmlDocument();
    var assertionElements = XmlDocument.DocumentElement.SelectNodes($"//*[local-name()='{Saml2Constants.Message.Assertion}']");
    var assertionElement = (assertionElements[0] as XmlElement).ToXmlDocument().DocumentElement;
    var certificate = ITfoxtec.Identity.Saml2.Util.CertificateUtil.Load(@"F:\IT-FoxTec-Core Copy\ITfoxtec.Identity.Saml2-master (1)\ITfoxtec.Identity.Saml2-master\test\TestIdPCore\itfoxtec.identity.saml2.testwebappcore_Certificate.crt");
    var wrappedAssertion = $@"<saml2:EncryptedAssertion xmlns:saml2=""urn:oasis:names:tc:SAML:2.0:assertion"">{assertionElement.OuterXml}</saml2:EncryptedAssertion>";
    xmlDoc.LoadXml(wrappedAssertion);
    var elementToEncrypt = (XmlElement)xmlDoc.GetElementsByTagName("Assertion", Saml2Constants.AssertionNamespace.OriginalString)[0];
    element1.InnerXml = wrappedAssertion.ToXmlDocument().DocumentElement.SelectNodes($"//*[local-name()='{Saml2Constants.Message.Assertion}']")[0].OuterXml;
    var element2 = wrappedAssertion.ToXmlDocument().DocumentElement;
    var childNode = XmlDocument.GetElementsByTagName("Assertion", Saml2Constants.AssertionNamespace.OriginalString)[0];
    XmlDocument.DocumentElement.RemoveChild(childNode);
    var status = XmlDocument.DocumentElement[Saml2Constants.Message.Status, Saml2Constants.ProtocolNamespace.OriginalString];
    XmlDocument.DocumentElement.InsertAfter(element1, status);
    if (certificate == null) throw new ArgumentNullException(nameof(certificate));
    var encryptedData = new EncryptedData
    {
        Type = EncryptedXml.XmlEncElementUrl,
        EncryptionMethod = new EncryptionMethod(EncryptedXml.XmlEncAES256Url)
    };
    var algorithm = true ? EncryptedXml.XmlEncRSAOAEPUrl : EncryptedXml.XmlEncRSA15Url;
    var encryptedKey = new EncryptedKey
    {
        EncryptionMethod = new EncryptionMethod(algorithm),
    };
    var encryptedXml = new EncryptedXml();
    byte[] encryptedElement;
    using (var encryptionAlgorithm = new AesCryptoServiceProvider())
    {
        encryptionAlgorithm.KeySize = 256;
        encryptedKey.CipherData = new CipherData(EncryptedXml.EncryptKey(encryptionAlgorithm.Key, (RSA)certificate.PublicKey.Key, true));
        encryptedElement = encryptedXml.EncryptData(elementToEncrypt, encryptionAlgorithm, false);
    }
    encryptedData.CipherData.CipherValue = encryptedElement;
    encryptedData.KeyInfo = new KeyInfo();
    encryptedData.KeyInfo.AddClause(new KeyInfoEncryptedKey(encryptedKey));
    EncryptedXml.ReplaceElement((XmlElement)xmlDoc.GetElementsByTagName("Assertion", Saml2Constants.AssertionNamespace.OriginalString)[0], encryptedData, false);
    EncryptedXml.ReplaceElement((XmlElement)XmlDocument.GetElementsByTagName("Assertion", Saml2Constants.AssertionNamespace.OriginalString)[0], encryptedData, false);
    if ((!(saml2RequestResponse is Saml2AuthnRequest) || saml2RequestResponse.Config.SignAuthnRequest) && saml2RequestResponse.Config.SigningCertificate != null)
    {
        Cryptography.SignatureAlgorithm.ValidateAlgorithm(saml2RequestResponse.Config.SignatureAlgorithm);
        XmlDocument = XmlDocument.SignDocument(saml2RequestResponse.Config.SigningCertificate, saml2RequestResponse.Config.SignatureAlgorithm, CertificateIncludeOption, saml2RequestResponse.Id.Value);
    }
    PostContent = string.Concat(HtmlPostPage(saml2RequestResponse.Destination, messageName));
    return this;
}
Here, the certificate is the public key certificate of the relying party.
I'm sorry to say that assertion response encryption is currently not supported.
You are welcome to create an issue about the missing encryption functionality.
If you implement the functionality, please share the code.

Upload file into S3 with AWS SDK ASP.NET

I am trying to upload an image from ASP.NET to S3. I am using the AWS SDK for that and have already set up what is needed. However, after I run my project, I receive an error. I'll be replacing my bucket name with ... in this sample code.
I set up my secret key and access key for the user in my Web.config. Please tell me if you need more code. I need help.
controller
private static readonly string _awsAccessKey = ConfigurationManager.AppSettings["AWSAccessKey"];
private static readonly string _awsSecretKey = ConfigurationManager.AppSettings["AWSSecretKey"];

[HttpPost]
public ActionResult UploadFile(HttpPostedFileBase file)
{
    try
    {
        if (file.ContentLength > 0)
        {
            IAmazonS3 client;
            using (client = Amazon.AWSClientFactory.CreateAmazonS3Client(_awsAccessKey, _awsSecretKey))
            {
                PutObjectRequest request = new PutObjectRequest
                {
                    BucketName = "...",
                    CannedACL = S3CannedACL.PublicRead,
                    Key = "images/" + (DateTime.Now.ToBinary() + "-" + file.FileName),
                    FilePath = Server.MapPath("~/UploadedFiles")
                };
                client.PutObject(request);
            }
        }
        imageUrls = "File Uploaded Successfully!!";
        System.Diagnostics.Debug.WriteLine("File Uploaded Successfully!!");
        return Json(imageUrls);
    }
    catch
    {
        ViewBag.Message = "File upload failed!!";
        System.Diagnostics.Debug.WriteLine("File upload failed!!");
        return Json(ViewBag.Message);
    }
}
You're getting the error because of DateTime.Now.ToBinary(), which can produce characters that are not valid in a URL. You could use a GUID or a Unix timestamp instead, for example.
Also, the FilePath property you're assigning on the PutObjectRequest is meant to be the full path and file name of a file to upload. You don't need it when you already have an HttpPostedFileBase input parameter, which exposes the InputStream property (i.e., the stream object).
Your PutObjectRequest should look something like this:
.
.
.
Guid guid = Guid.NewGuid();
// Create a client
AmazonS3Client client = new AmazonS3Client(_awsAccessKey, _awsSecretKey);
// Create a PutObject request
PutObjectRequest request = new PutObjectRequest
{
    BucketName = "...",
    CannedACL = S3CannedACL.PublicRead,
    Key = "images/" + guid + "-" + file.FileName
};
using (System.IO.Stream inputStream = file.InputStream)
{
    request.InputStream = inputStream;
    // Put object
    PutObjectResponse response = client.PutObject(request);
}
.
.
.
I finally solved it. I realized I did not specify the region in AWSClientFactory, right at the end after the keys.
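As a minimal sketch of that fix (the region below is a placeholder; use the region your bucket lives in), the client creation looks something like this:
// Pass the region endpoint as the last argument, after the access and secret keys.
IAmazonS3 client = Amazon.AWSClientFactory.CreateAmazonS3Client(
    _awsAccessKey, _awsSecretKey, Amazon.RegionEndpoint.APSoutheast1);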

401 System.UnauthorizedAccessException when accessing Dropbox with the SharpBox API

The code
config = CloudStorage.GetCloudConfigurationEasy(nSupportedCloudConfigurations.DropBox)
    as DropBoxConfiguration;
//config.AuthorizationCallBack = new Uri("http://localhost:61926/DBoxDemo.aspx");
requestToken = DropBoxStorageProviderTools.GetDropBoxRequestToken(config, "KEY", "SECRET");
//Session["requestToken"] = requestToken;
string AuthoriationUrl = DropBoxStorageProviderTools.GetDropBoxAuthorizationUrl(
    config, requestToken);
Process.Start(AuthoriationUrl);
accessToken = DropBoxStorageProviderTools.ExchangeDropBoxRequestTokenIntoAccessToken(
    config, "xxxxxxxxxxxxx", "xxxxxxxxxxxxx", requestToken);
CloudStorage dropBoxStorage = new CloudStorage();
var storageToken = dropBoxStorage.Open(config, accessToken);
var publicFolder = dropBoxStorage.GetFolder("/");
// upload a testfile from temp directory into public folder of DropBox
String srcFile = Environment.ExpandEnvironmentVariables(@"C:\Test\MyTestFile.txt");
var rep = dropBoxStorage.UploadFile(srcFile, publicFolder);
MessageBox.Show("Uploaded Successfully..");
dropBoxStorage.DownloadFile("/MyTestFile.txt",
    Environment.ExpandEnvironmentVariables("D:\\test"));
MessageBox.Show("Downloaded Successfully..");
dropBoxStorage.Close();
This is the error shown in Visual Studio (the 401 System.UnauthorizedAccessException from the title).
SharpBox has a bug that only occurs in .NET 4.5, because the behavior of the System.Uri class changed between 4.0 and 4.5.
The method GetDownloadFileUrlInternal() in DropBoxStorageProviderService.cs generates an incorrect URL because it encodes a slash as %2f. In .NET 4.0 this URL was converted back correctly by the System.Uri object in the GenerateSignedUrl() method in OAuthUrlGenerator.cs.
I have changed the method GetDownloadFileUrlInternal() from this...
public static String GetDownloadFileUrlInternal(IStorageProviderSession session, ICloudFileSystemEntry entry)
{
    // cast variables
    DropBoxStorageProviderSession dropBoxSession = session as DropBoxStorageProviderSession;
    // gather information
    String rootToken = GetRootToken(dropBoxSession);
    String dropboxPath = GenericHelper.GetResourcePath(entry);
    // add all information to url;
    String url = GetUrlString(DropBoxUploadDownloadFile, session.ServiceConfiguration) + "/" + rootToken;
    if (dropboxPath.Length > 0 && dropboxPath[0] != '/')
        url += "/";
    url += HttpUtilityEx.UrlEncodeUTF8(dropboxPath);
    return url;
}
...to this...
public static String GetDownloadFileUrlInternal(IStorageProviderSession session, ICloudFileSystemEntry entry)
{
    // cast variables
    DropBoxStorageProviderSession dropBoxSession = session as DropBoxStorageProviderSession;
    // gather information
    String rootToken = GetRootToken(dropBoxSession);
    // add all information to url;
    String url = GetUrlString(DropBoxUploadDownloadFile, session.ServiceConfiguration) + "/" + rootToken;
    ICloudFileSystemEntry parent = entry.Parent;
    String dropboxPath = HttpUtilityEx.UrlEncodeUTF8(entry.Name);
    while (parent != null)
    {
        dropboxPath = HttpUtilityEx.UrlEncodeUTF8(parent.Name) + "/" + dropboxPath;
        parent = parent.Parent;
    }
    if (dropboxPath.Length > 0 && dropboxPath[0] != '/')
        url += "/";
    url += dropboxPath;
    return url;
}
and currently it works with .NET 4.5. There may be a better way to fix the problem, but so far I have not noticed any misbehavior.

How to consume a LOB Adapter SDK-based design-time interfaces

I'm trying to build a web-based GUI to consume custom LOB Adapter SDK-based connectors.
In particular, I would like to browse the metadata using the IMetadataResolverHandler interface.
I'm having two problems:
The first problem happens when trying to instantiate the custom adapter. My plan is to obtain an instance of the IConnectionFactory interface, through which I could get a new IConnection and connect to the target LOB system.
Since the most interesting methods in the Adapter base class are protected, I can only seem to succeed using reflection (please, see the sample code below).
The second problem happens when trying to browse the metadata from the target system. The method Browse on the IMetadataResolverHandler interface expects an instance of a MetadataLookup object that I have no idea how to obtain.
Please, see the sample code below:
static void Main(string[] args)
{
    var extension = new SqlAdapterBindingElementExtensionElement();
    var adapter = (Adapter) Activator.CreateInstance(extension.BindingElementType);
    var isHandlerSupportedMethodInfo = adapter.GetType().GetMethod("IsHandlerSupported", BindingFlags.NonPublic | BindingFlags.Instance);
    var buildConnectionUri = adapter.GetType().GetMethod("BuildConnectionUri", BindingFlags.NonPublic | BindingFlags.Instance);
    var buildConnectionFactory = adapter.GetType().GetMethod("BuildConnectionFactory", BindingFlags.NonPublic | BindingFlags.Instance);
    if (isHandlerSupportedMethodInfo == null || buildConnectionUri == null || buildConnectionFactory == null)
    {
        Console.WriteLine("Not a LOB adapter.");
        Environment.Exit(1);
    }
    var isHandlerSupportedTHandler = isHandlerSupportedMethodInfo.MakeGenericMethod(typeof(IMetadataResolverHandler));
    var isMetadataBrowseSupported = (bool)isHandlerSupportedTHandler.Invoke(adapter, new object[] { });
    if (!isMetadataBrowseSupported)
    {
        Console.WriteLine("Metadata retrieval not supported.");
        Environment.Exit(1);
    }
    var bindingElement = (SqlAdapterBindingElement)adapter;
    bindingElement.AcceptCredentialsInUri = false;
    bindingElement.InboundOperationType = InboundOperation.TypedPolling;
    bindingElement.PolledDataAvailableStatement = "EXEC [dbo].[usp_IsDataAvailable]";
    bindingElement.PollingStatement = "EXEC [dbo].[usp_SelectAvailableData]";
    bindingElement.PollingIntervalInSeconds = 10;
    var binding = new CustomBinding();
    binding.Elements.Add(adapter);
    var parameters = new BindingParameterCollection();
    var context = new BindingContext(binding, parameters);
    var credentials = new ClientCredentials();
    credentials.UserName.UserName = "username";
    credentials.UserName.Password = "password";
    var address = (ConnectionUri) buildConnectionUri.Invoke(adapter, new []{ new Uri("mssql://azure.database.windows.net//SampleDb?InboundId=uniqueId")});
    var connectionFactory = (IConnectionFactory)buildConnectionFactory.Invoke(adapter, new object[] { address, credentials, context });
    var connection = connectionFactory.CreateConnection();
    connection.Open(TimeSpan.MaxValue);
    MetadataLookup lookup = null; // ??
    var browser = connection.BuildHandler<IMetadataBrowseHandler>(lookup);
    connection.Close(TimeSpan.MaxValue);
}
Answering my own question, I figured it out by inspecting the code of the "Consume Adapter Service" wizard. The key is to use the IMetadataRetrievalContract interface which, internally, is implemented using up to three LOB-SDK interfaces, and in particular IMetadataResolverHandler.
Here is code that works without reflection:
var extension = new SqlAdapterBindingElementExtensionElement();
var adapter = (Adapter) Activator.CreateInstance(extension.BindingElementType);
var bindingElement = (SqlAdapterBindingElement)adapter;
bindingElement.AcceptCredentialsInUri = false;
bindingElement.InboundOperationType = InboundOperation.TypedPolling;
bindingElement.PolledDataAvailableStatement = "EXEC [dbo].[usp_IsDataAvailable]";
bindingElement.PollingStatement = "EXEC [dbo].[usp_SelectAvailableData]";
bindingElement.PollingIntervalInSeconds = 10;
var binding = new CustomBinding();
binding.Elements.Add(adapter);
const string endpoint = "mssql://azure.database.windows.net//SampleDb?InboundId=unique";
var factory = new ChannelFactory<IMetadataRetrievalContract>(binding, new EndpointAddress(new Uri(endpoint)));
factory.Credentials.UserName.UserName = "username";
factory.Credentials.UserName.Password = "password";
factory.Open();
var channel = factory.CreateChannel();
((IChannel)channel).Open();
var metadata = channel.Browse(MetadataRetrievalNode.Root.DisplayName, 0, Int32.MaxValue);
((IChannel) channel).Close();
factory.Close();
