I have a WebForms app that uses the WindowsAzure.Storage API v3. It works fine in development and in one production environment, but I'm rolling out a new instance, and any code that calls out to Azure Blob Storage gives me a 403 error.
I've been fiddling with this for a while, and it fails on any call out to Blob Storage, so rather than show my code I'll show my stack trace:
[WebException: The remote server returned an error: (403) Forbidden.]
System.Net.HttpWebRequest.GetResponse() +8525404
Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync(RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext) +1541
[StorageException: The remote server returned an error: (403) Forbidden.]
Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync(RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext) +2996
Microsoft.WindowsAzure.Storage.Blob.CloudBlobContainer.CreateIfNotExists(BlobContainerPublicAccessType accessType, BlobRequestOptions requestOptions, OperationContext operationContext) +177
ObsidianData.Azure.Storage.GetContainer(CloudBlobClient client, Containers targetContainer) in D:\Dev\nSource\Obsidian\Source\ObsidianData\Azure\Storage.vb:84
ObsidianWeb.Leads.HandleListenLink(String fileName, HyperLink link) in D:\Dev\nSource\Obsidian\Source\ObsidianWeb\Bdc\Leads.aspx.vb:188
ObsidianWeb.Leads.LoadEntity_ContactDetails(BoLead lead) in D:\Dev\nSource\Obsidian\Source\ObsidianWeb\Bdc\Leads.aspx.vb:147
ObsidianWeb.Leads.LoadEntity(BoLead Lead) in D:\Dev\nSource\Obsidian\Source\ObsidianWeb\Bdc\Leads.aspx.vb:62
EntityPages.EntityPage`1.LoadEntity() +91
EntityPages.EntityPage`1.Page_LoadComplete(Object sender, EventArgs e) +151
System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +4018
Here's what I've tried...
The AzureStorageConnectionString that fails in this environment definitely works in production
Other connection strings (from the other production environment, which works) also get a 403 here
There seemed to be an issue with timestamps in some old versions of the REST API (which I am not using directly...), so I made certain the times are correct, and even tried switching the server to UTC time.
Tried toggling the connection string between http/https.
Upgraded to the latest version of the API (v3.1)
Tried fiddling with the code to ensure that every call out to Azure Storage gets 403. It does.
In desperation, I installed Azure PowerShell on the server just to verify that some kind of communication with Azure works. It does.
I also browsed to the Azure management portal, and that works fine.
Any ideas? This should just be using port 80 or 443, right? So there should be no way this is some kind of network issue. Let me know if that's wrong.
There are also some differences between the servers:
The working production machine is an Azure VM (Server 2008 R2 with IIS 7.5)
This new machine is physical hardware (Server 2012 and IIS 8)
This IS using a different storage account inside my Azure subscription; however, I've tried a total of three connection strings and none of them work here.
UPDATE: someone asked to see the code. Okay, I wrote a class called Azure.Storage, which just abstracts my cloud storage code. We are failing on a call to Storage.Exists, so here's the part of that class that feels relevant:
Public Shared Function Exists(container As Containers, blobName As String) As Boolean
    ' Blob names are forced to lowercase before the call goes out.
    Dim Dir As CloudBlobContainer = GetContainer(GetCloudBlobClient(), container)
    Dim Blob As CloudBlockBlob = Dir.GetBlockBlobReference(blobName.ToLower())
    Return Blob.Exists()
End Function

Private Shared Function GetContainer(client As CloudBlobClient, targetContainer As Containers) As CloudBlobContainer
    ' Container names are forced to lowercase; Azure rejects upper-case container names.
    Dim Container As CloudBlobContainer = client.GetContainerReference(targetContainer.ToString.ToLower())
    Container.CreateIfNotExists()
    Container.SetPermissions(New BlobContainerPermissions() With {.PublicAccess = BlobContainerPublicAccessType.Blob})
    Return Container
End Function

Private Shared Function GetCloudBlobClient() As CloudBlobClient
    Dim Account As CloudStorageAccount = CloudStorageAccount.Parse(Settings.Cloud.AzureStorageConnectionString())
    Return Account.CreateCloudBlobClient()
End Function
...Containers is just an enum of container names (there are several):
Public Enum Containers
    CallerWavs
    CampaignImports
    Delve
    Exports
    CampaignImages
    Logos
    ReportLogos
    WebLinkImages
End Enum
...Yes, they have upper-case characters, which causes problems. Everything is forced to lowercase before it goes out.
Also, I did verify that the correct AzureStorageConnectionString is coming out of my settings class. Again, I tried a few that work elsewhere, and this one works elsewhere too!
Please check the clock on the servers in question. Apart from an incorrect account key, you can also get a 403 error if the time on the server is not in sync with the time on the storage servers (a deviation of roughly +/- 15 minutes is allowed).
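If you want to check the skew directly from the machine, here's a minimal sketch (C#): even an anonymous request to the blob endpoint comes back with a Date header you can compare against the local clock. The account name is a placeholder.

using System;
using System.Globalization;
using System.Net;

class ClockSkewCheck
{
    static void Main()
    {
        // "youraccount" is a placeholder; even a failed request still
        // carries the storage server's Date header in the response.
        var request = (HttpWebRequest)WebRequest.Create("https://youraccount.blob.core.windows.net/");
        request.Method = "HEAD";

        WebResponse response;
        try { response = request.GetResponse(); }
        catch (WebException ex) { response = ex.Response; }

        if (response != null)
        {
            DateTime serverUtc = DateTime.Parse(response.Headers["Date"],
                CultureInfo.InvariantCulture).ToUniversalTime();
            Console.WriteLine("Local UTC:  {0:o}", DateTime.UtcNow);
            Console.WriteLine("Server UTC: {0:o}", serverUtc);
            Console.WriteLine("Skew:       {0}", DateTime.UtcNow - serverUtc);
            response.Close();
        }
    }
}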
I also ran into this error. My problem was that I had turned ON dynamic IP security restrictions in my web.config, and in some cases (e.g. pages with lots of images) the number of files being downloaded was exceeding the max thresholds I had defined.
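For reference, the section to look for is roughly this (the thresholds here are made up; the point is that a page pulling down many files at once can trip whatever limits you set and get 403s back):

<system.webServer>
  <security>
    <dynamicIpSecurity>
      <denyByConcurrentRequests enabled="true" maxConcurrentRequests="10" />
      <denyByRequestRate enabled="true" maxRequests="30" requestIntervalInMilliseconds="300" />
    </dynamicIpSecurity>
  </security>
</system.webServer>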
In my case, the access key was not the same as the one in the connection string used by the source code.
So try rechecking under Azure -> [Storage Account Name] -> Access Keys -> key1 -> Key & Connection string.
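A quick way to rule this out is a minimal console sketch like the one below (same WindowsAzure.Storage library; the connection string is a placeholder, so paste the exact one from the portal). Listing containers forces an authenticated call, so a mismatched key fails right there with the same 403:

using System;
using Microsoft.WindowsAzure.Storage;

class ConnectionStringCheck
{
    static void Main()
    {
        // Placeholder connection string; use the exact value from the portal.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey");
        var client = account.CreateCloudBlobClient();

        // ListContainers makes an authenticated request, so a wrong key
        // throws a StorageException (403) on the first iteration.
        foreach (var container in client.ListContainers())
            Console.WriteLine(container.Name);
    }
}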
Related
I had to create a new dev environment with fewer VMs because my IDE kept crashing, so I created a VM with AD and IIS on the same server.
I was using the following code fine in my old environment:
PrincipalContext ctx = new PrincipalContext(ContextType.Domain,
    Environment.GetEnvironmentVariable("DOMAIN"),
    Environment.GetEnvironmentVariable("USER_OU"),
    Environment.GetEnvironmentVariable("SERVICE_USERNAME"),
    Environment.GetEnvironmentVariable("SERVICE_PASSWORD"));

UserPrincipalEx usr = new UserPrincipalEx(ctx);
usr.Name = ticket.FirstName + " " + ticket.LastName;
usr.SamAccountName = ticket.Username;
usr.GivenName = ticket.FirstName;
usr.Surname = ticket.LastName;
usr.DisplayName = ticket.FirstName + " " + ticket.Account.LastName;
usr.UserPrincipalName = ticket.Username + "#" + Environment.GetEnvironmentVariable("DOMAIN");
usr.Enabled = enabled;

try
{
    usr.Save();
    usr.SetPassword(temppwd);
    usr.ExpirePasswordNow();
}
I can still save the user and it appears in AD, however SetPassword no longer works:
[IIS EXPRESS] Request started: "POST" https://localhost:5001/create
System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.UnauthorizedAccessException: Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))
--- End of inner exception stack trace ---
at System.DirectoryServices.DirectoryEntry.Invoke(String methodName, Object[] args)
at System.DirectoryServices.AccountManagement.SDSUtils.SetPassword(DirectoryEntry de, String newPassword)
at System.DirectoryServices.AccountManagement.ADStoreCtx.SetPassword(AuthenticablePrincipal p, String newPassword)
at System.DirectoryServices.AccountManagement.PasswordInfo.SetPassword(String newPassword)
at System.DirectoryServices.AccountManagement.AuthenticablePrincipal.SetPassword(String newPassword)
My service account is a Domain Admin and I get the same error if I try my own AD creds.
I have tried calling SetPassword() before Save(), but it fails at the same point.
The only difference is that I have AD and IIS on the same server. I have tried both JetBrains Rider and VS2019. I am getting very close to my project deadline and I am really stuck.
None of the users have 'User cannot change password' set, and the new users don't have any options under 'Account options' set.
SetPassword sets the unicodePwd attribute. That has some restrictions on when it can be updated. The documentation for that says:
Windows 2000 operating system servers require that the client have a 128-bit (or better) SSL/TLS-encrypted connection to the DC in order to modify this attribute. On Windows Server 2003 operating system and later, the DC also permits modification of the unicodePwd attribute on a connection protected by 128-bit (or better) Simple Authentication and Security Layer (SASL)-layer encryption instead of SSL/TLS.
It should set up a secure connection by default (it does for me), but it's possible that it can't in your setup for whatever reason.
You can pass a ContextOptions object in the constructor to your PrincipalContext. By default that is automatically set to ContextOptions.Negotiate | ContextOptions.Signing | ContextOptions.Sealing, which should be secure. But ContextOptions.Negotiate uses "either Kerberos or NTLM", and ContextOptions.Signing (the encryption) depends on Kerberos. So maybe it's falling back to NTLM and can't encrypt.
You might be able to confirm this by inspecting these values, after you create the account:
Console.WriteLine(ctx.Options);
Console.WriteLine(((DirectoryEntry) usr.GetUnderlyingObject()).AuthenticationType);
The values you'd be looking for are:
Negotiate, Signing, Sealing
Secure, Signing, Sealing
That's what I have when SetPassword works. But I'm not sure if it actually changes those values if it falls back to NTLM. Sometimes it does that pretty silently.
In any case, if Kerberos isn't happening, you can either troubleshoot that, or attempt to connect via LDAPS (LDAP over SSL). That would look something like this:
PrincipalContext ctx = new PrincipalContext(ContextType.Domain,
Environment.GetEnvironmentVariable("DOMAIN") + ":636",
Environment.GetEnvironmentVariable("USER_OU"),
ContextOptions.Negotiate | ContextOptions.SecureSocketLayer,
Environment.GetEnvironmentVariable("SERVICE_USERNAME"),
Environment.GetEnvironmentVariable("SERVICE_PASSWORD"));
But that can cause other issues since your DC needs to have a certificate that you trust.
Hi everyone,
I am developing a web application that uses X509Certificate2 to get a private key from a certificate file. The code snippet looks like the following:
public static RSACryptoServiceProvider GetSignProviderFromPfx()
{
    // Verbatim string so the backslashes in the path survive.
    var strFileName = @"c:\cer\mycerfile.pfx";
    var strPassword = "000000";
    X509Certificate2 pc = new X509Certificate2(strFileName, strPassword, X509KeyStorageFlags.MachineKeySet);
    var thePrivateKey = pc.PrivateKey;
    return (RSACryptoServiceProvider)thePrivateKey;
}
But the statement pc.PrivateKey throws a System.Security.Cryptography.CryptographicException: "Invalid provider type specified". I'm sure the certificate file is fine; it really does contain a private key, and pc.HasPrivateKey returns true.
The test environment is VS2013 on Windows 7.
I also tried the following:
a. I debugged it in VS2013 with IIS Express; the problem occurred.
b. I debugged it on another computer with the same environment as mine; the problem occurred too.
c. I published the application to a server running IIS on Windows Web Server 2008 R2; it worked fine.
d. I published the application to a Windows Azure website; it also worked fine.
Therefore, I guess the code snippet itself is fine. The exception is probably caused by something in the running environment. I checked and compared the read/write permissions on the certificate file in the different environments, and they are all the same.
Can anybody help?
Thanks.
I have been having the same problem and this is what I did to solve it. Hopefully this will help you too.
We had to set Load User Profile to True in the app pool's advanced settings.
You can also set it directly in applicationHost.config, I believe.
https://blogs.msdn.microsoft.com/vijaysk/2009/03/08/iis-7-tip-3-you-can-now-load-the-user-profile-of-the-application-pool-identity/
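If you'd rather script it than click through the UI, something like this should do it (the pool name is a placeholder):

%windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /processModel.loadUserProfile:true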
Maybe this is useful to you; locally, try:
X509Certificate2 pc = new X509Certificate2(strFileName, strPassword);
I have an application that connects to Reporting Services in SQL Server 2008 R2.
The error is the following:
System.Net.WebException: The request failed with HTTP status 401: Unauthorized.
at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse
(SoapClientMessage message, WebResponse response, Stream responseStream,
Boolean asyncCall)
at System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke
(String methodName, Object[] parameters)
at Microsoft.SqlServer.ReportingServices2005.Execution.ReportExecutionService.LoadReport
(String Report, String HistoryID)
The application is running fine in production at 2 different customers, so it's not a coding issue.
I'm trying to install it now on a customer's server, which is using AD. The SQL Server and IIS are all on the same machine though, so I don't really care about AD.
It runs if I run IE as Administrator, but it doesn't work with other users. The ASP.NET app is connecting to SSRS using a user created in the local machine (called ReportingServicesUser), member of the ReportingServicesUser group.
Things I've tried:
Adding ReportingServicesUser to the Site Settings in the RS website (did the same for Network Service, IUSR, the Authenticated Users group, Local Service, etc)
Adding ReportingServicesUser to the folder permissions in the RS website (did the same for Network Service, IUSR, the Authenticated Users group, Local Service, etc)
Added permissions for that users to the databases (app database and RS related dbs)
Added NTFS permissions to the RS folders (I will double check though).
Connecting to the RS using http://localhost, http://computername and http://domain.com
For reference, the code is this (simplified version):
var service = new ReportExecutionService();
service.Credentials = new NetworkCredential("ReportingServicesUser", "password");
service.Url = "http://computername:90/ReportServer/ReportExecution2005.asmx";
service.ExecutionHeaderValue = new ExecutionHeader();
var execInfo = new ExecutionInfo();
execInfo = service.LoadReport("path-to-the-report", null); // ===> Here it throws the exception
I've read a lot of posts and pages about this but I cannot get an answer that works for me.
OK, I've finally had to change the code to:
service.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
and it worked. There's probably another solution, but I couldn't find it.
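Putting it together, the working version of the snippet above looks like this (same placeholder URL and report path as before):

var service = new ReportExecutionService();
// Use the identity the app is already running under instead of the
// explicit local ReportingServicesUser account:
service.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
service.Url = "http://computername:90/ReportServer/ReportExecution2005.asmx";
service.ExecutionHeaderValue = new ExecutionHeader();
var execInfo = service.LoadReport("path-to-the-report", null);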
Are you sure you gave ReportingServicesUser browse permissions for the specific report in SSRS? It seems the request never made it to the server, but I would check \Reporting Services\LogFiles just to be certain.
Also, your "report user" needs to be defined on the reporting server with the credentials you send.
I'm trying to start a new process from my WCF Service. For that purpose I use
var process = Process.Start(new ProcessStartInfo
{
    WorkingDirectory = config.WorkingDirectory,
    FileName = config.WorkingDirectory,
    Arguments = string.Format("{0} {1}", mpcName, jobId),
    CreateNoWindow = false,
    WindowStyle = ProcessWindowStyle.Hidden
});
The WebApp is using a separate AppDomain whose Identity is set to a user account having administrator rights on the server.
Process.Start throws an exception saying:
Server execution failed, at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo)
I also tested setting the user and password in ProcessStartInfo. Specifying the password was quite tricky (SecureString), and then I received:
The stub received bad data, at System.Diagnostics.Process.StartWithCreateProcess(ProcessStartInfo startInfo)
so I abandoned that approach.
Do you know what the reason for my problem is and how I can fix it?
I forgot: I'm using Windows Server 2008 R2, IIS 7
I got it!
It's very strange but the only change needed was to invoke
Process.Start(exeFullPath, args);
Obviously the combination of ProcessStartInfo props is important.
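Looking at the original snippet again, I suspect what mattered is that FileName was set to config.WorkingDirectory, so ShellExecuteEx was being asked to launch a folder. If you want to keep the ProcessStartInfo overload, a sketch like this (exeFullPath being the same path I now pass to Process.Start) should behave the same, since UseShellExecute = false makes .NET go through CreateProcess instead of ShellExecuteEx:

var process = Process.Start(new ProcessStartInfo
{
    FileName = exeFullPath,                  // the executable, not the working directory
    WorkingDirectory = config.WorkingDirectory,
    Arguments = string.Format("{0} {1}", mpcName, jobId),
    UseShellExecute = false,                 // CreateProcess path, no shell involved
    CreateNoWindow = true
});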
This Q&A helped me fix the issue in one of my projects, but with a different cause:
I was trying to start a process as a domain user from an integration test being run by nCrunch. It turns out MY problem was a really long argument string.
(The same argument string works with no user/password.)
The environment is Windows 8, 64-bit.
Anyway, I'm just going to have to pass the argument data a different way.
I'm having a problem getting access to a database which lives on a remote server.
I have a ASP.NET 2.0 webpage that is trying to connect to a database.
The database is accessed via a virtual folder (which I set up in IIS).
The virtual folder points at a remote share which contains the database.
The virtual folder (in the web apps root directory) is pointing at a share on a remote server via a UNC path:
\\databaseServerName\databaseFolder$\
The virtual folder has 'read' and 'browse' permissions set to 'true'.
I store the connection string in the 'appSettings' section of the web.config:
<add key="conStrVirtual" value="Provider=Microsoft.Jet.OleDb.4.0;Data Source=http://webAppServerName/virtualFolderName/databaseName.MDB;Jet OLEDB:Database Password=dumbPassword;"/>
The connection object is declared on my .aspx page:
Dim objConnVirtual As New OleDbConnection(ConfigurationManager.AppSettings("conStrVirtual"))
Here is the code that tries to use the connection object:
Public Sub Test()
    If objConnVirtual.State <> ConnectionState.Open Then
        objConnVirtual.Open()
    End If
    Dim cmd As OleDbCommand = New OleDbCommand("SELECT * FROM TableName", objConnVirtual)
    objDR = cmd.ExecuteReader()
    If objDR.Read() Then
        Response.Write("Shazaam! Data shows up here")
    End If
    objDR.Close()
    objConnVirtual.Close()
End Sub
When I run the above code I get the following error (on the objConnVirtual.Open() line):
Exception Details: System.Data.OleDb.OleDbException: Not a valid file name.
I have checked the database name and it is correct (even copy/pasted it to make sure)
If I put the 'Data Source' section of the connection string into the address bar of my browser I can successfully see the contents of the share on the remote server.
Not sure if this is a problem with permissions or with the code.
I have googled the crap out of this but have not been able to find a solution.
Any help is much appreciated.
When accessing a remote Access MDB database, you have to specify a UNC path like \\remoteMachine\Share\test.mdb.
Make sure your application pool identity has the right permissions to connect to the remote share. By default on IIS 6 you are working with the Network Service account, which is by default not allowed to access a remote share.
The best way is to let the AppPool run with a dedicated service user.
What is the account being used on your server when your web app tries to read the db file? Whatever this user account is, it needs to have permissions to read that folder/file. In IIS6 you can configure the virtual folder to use any user account... on the Directory Security tab there's an Edit button under Authentication and access control.
It seems likely that your error message is just a generic error message, and the permissions problem is your real issue.
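If you're not sure which account that actually is, a one-liner in the page will tell you (C# shown; the call is the same from VB):

// Writes out the Windows identity the worker process uses for this request.
Response.Write(System.Security.Principal.WindowsIdentity.GetCurrent().Name);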
Make sure the two servers have network access to each other, and also specify the IP and port of the database server in your connection string.
Update
I should also mention that it works on my machine (but not once loaded up to the production box) if I declare the connection string in the 'appSettings' section of the web.config like this:
<add key="conStrVirtual" value="Provider=Microsoft.Jet.OleDb.4.0;Data Source=\\databaseServerName\databaseFolder$\databaseName.MDB;Jet OLEDB:Database Password=dumbPassword;"/>
This leads me to think that it could be an issue with needing to use domain credentials other than the local IUSER account.
UPDATE
First up, thank you to everyone who submitted answers.
However, we ended up not using the 'connect to remote database via virtual folder' method because the complexity of the permissions needed to get this to work was causing us more problems than it was worth. We put the UNC path back into the connection string, which may not be the best way to do this, but is working for us.