ASP.NET HttpWebRequest with Kerberos Authentication

I am trying to connect to a web service that uses Kerberos authentication to authorize the user, but all I get is a 401 Unauthorized every time I make the request. Below is the code that I am using. Thanks in advance for any help you can provide!
public XPathNavigator GSASearch(string url, string searchString)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url + searchString);
    request.CookieContainer = new CookieContainer();
    request.Credentials = CredentialCache.DefaultCredentials;
    request.ContentType = "text/xml";
    request.Method = "POST";

    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    Stream receiveStream = response.GetResponseStream();
    XPathDocument doc = new XPathDocument(receiveStream);
    return doc.CreateNavigator();
}
EDIT: I feel I should explain a bit more about what I am attempting to do. I have been tasked with providing a new interface for my company's Google Search Appliance. I am using an ASP.NET page, which does some things like choose a Collection depending on where a user is located, etc., and then sends the appropriate search string to the GSA. This was all working well until they decided to turn authentication on, and now I can't get any results (I either get a 401 Unauthorized, or a message stating that 'Data at the root level is invalid'). If I take the search string and provide it directly to the GSA, it authenticates fine and displays the results; I just can't seem to get it through the HttpWebRequest.
EDIT 2: I did a little more looking (ran the request through Fiddler) and it looks like the request is only attempting Negotiate and not Kerberos. I set the credentials to use Kerberos explicitly as below, but it didn't help...
public XPathNavigator GSASearch(string url, string searchString)
{
    CredentialCache credCache = new CredentialCache();
    credCache.Add(new Uri(url), "Kerberos", CredentialCache.DefaultNetworkCredentials);

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url + searchString);
    request.CookieContainer = new CookieContainer();
    request.PreAuthenticate = true;
    request.Credentials = credCache;
    request.ContentType = "text/xml";
    request.Method = "POST";

    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    Stream receiveStream = response.GetResponseStream();
    //StreamReader readStream = new StreamReader(receiveStream);
    XPathDocument doc = new XPathDocument(receiveStream);
    return doc.CreateNavigator();
}
EDIT 3: OK, looking closer again, CredentialCache.DefaultCredentials doesn't appear to contain my network credentials...
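For reference, one quick way to check which Windows identity the ASP.NET request actually runs under (and therefore what DefaultCredentials will send) is to dump the current WindowsIdentity; a minimal sketch:

using System.Security.Principal;
using System.Web;

// Identity of the worker process: this is what DefaultCredentials sends.
// Under IIS it is typically the app-pool account, not the browsing user,
// unless impersonation/delegation is configured.
string processIdentity = WindowsIdentity.GetCurrent().Name;

// Identity of the authenticated ASP.NET user (may differ from the above).
string requestIdentity = HttpContext.Current.User.Identity.Name;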

1) Have you done a wireshark trace of a successful session to the GSA using the browser? Does that work?
2) If #1 works, what is the WWW-Authenticate header that is sent by the GSA on the first unauthenticated request?
3) Is the machine on which the ASPX app is running a part of the same AD domain that the GSA is in? AFAIK this is probably required for a successful auth.
4) Next, since it is the ASPX app that is making the request, you cannot use DefaultCredentials, because you actually need the credentials of a user that is trusted by the GSA. For this you should either create a special user account for the app that talks to the GSA, or have each user be a trusted user on the GSA, authenticate the user on the ASPX page first, and then pass those credentials on to the GSA using delegation (a sketch of the service-account option follows this answer). For the latter you will also have to make the server running the ASPX app trusted for delegation.
In my opinion, you should first model your code as a console app that you can run and debug, then port it to the ASPX page. That way you will know whether the failure is due to the host (ASPX vs. console) or something else.
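A minimal sketch of the service-account approach from point 4, assuming a dedicated account that the GSA trusts (the account name, password and domain below are placeholders):

CredentialCache credCache = new CredentialCache();
// "gsa-search-svc" is a hypothetical service account created for this app.
// "Negotiate" will use Kerberos when the SPNs are set up correctly and
// fall back to NTLM otherwise.
credCache.Add(new Uri(url), "Negotiate",
    new NetworkCredential("gsa-search-svc", "password", "MYDOMAIN"));

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url + searchString);
request.PreAuthenticate = true;
request.Credentials = credCache;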

Related

Do I need to investigate HTTP 401 caught by Fiddler during an HttpWebRequest POST to an ASP.NET Web API server

I am totally new to Web API, web requests and related things.
After hours of googling I finally managed to do a POST using C# and HttpWebRequest.
When I run the HttpWebRequest in debug mode in Visual Studio I do not get any exceptions.
My app works as I expect: I send data to the Web API server and also get data back.
To see how my app communicates with the Web API server I started Fiddler Web Debugger.
During the POST to the Web API, Fiddler catches 401 errors:
{"Message":"Authorization has been denied for this request."}
Stepping through in the debugger I found that the following lines of code produce the 401 error:
HttpWebRequest wr = (HttpWebRequest)WebRequest.Create(url);
wr.Credentials = new NetworkCredential(username, password);
wr.Method = "POST";
wr.ContentType = "application/json";

byte[] byteArray = System.Text.Encoding.UTF8.GetBytes(body);
wr.ContentLength = byteArray.Length;
using (System.IO.Stream dataStream = wr.GetRequestStream())
{
    dataStream.Write(byteArray, 0, byteArray.Length); // After this line, Fiddler catches an HTTP 401
}
Later in the code, when I call wr.GetResponse(), I do get status 200 OK.
My questions are:
Do I need to redesign my code to avoid this error in Fiddler?
Are there other ways to send a JSON string with HttpWebRequest besides using GetRequestStream()?
If your service is enabled with Windows Authentication, then in Fiddler you can select the option to automatically authenticate using your logged-on credentials by going here:
Composer tab -> Options tab -> Automatically Authenticate
Also, why not use HttpClient from System.Net.Http? It has a much better and easier programming model. Example:
// UseDefaultCredentials sends the current logged-on Windows credentials,
// which is what Windows Authentication on the service expects.
HttpClientHandler handler = new HttpClientHandler();
handler.UseDefaultCredentials = true;

HttpClient client = new HttpClient(handler);
client.BaseAddress = new Uri("http://localhost:9095/");

// PostAsJsonAsync comes from the Web API client libraries (System.Net.Http.Formatting).
HttpResponseMessage response = client.PostAsJsonAsync<Customer>("api/values", cust).Result;
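To the second question: if you would rather not depend on the Web API client formatting library, a sketch of posting the JSON string yourself with HttpClient (reusing the url and body variables from the question) looks like this:

HttpClientHandler handler = new HttpClientHandler();
handler.UseDefaultCredentials = true;

HttpClient client = new HttpClient(handler);
// "body" is the JSON string that was previously written to GetRequestStream().
StringContent content = new StringContent(body, System.Text.Encoding.UTF8, "application/json");
HttpResponseMessage response = client.PostAsync(url, content).Result;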

Executing a webrequest without redirecting a page

I ran into a weird problem while using OpenID in ASP.NET. I wanted a server-side logout for a Gmail account, but without redirecting to another page.
I thought executing a web request would do that. This is my code:
HttpWebRequest loHttp =
    (HttpWebRequest)WebRequest.Create("https://www.google.com/accounts/Logout");

// *** Set properties
loHttp.Timeout = 10000; // 10 secs
loHttp.UserAgent = "Code Sample Web Client";

// *** Retrieve request info headers
HttpWebResponse loWebResponse = (HttpWebResponse)loHttp.GetResponse();

Encoding enc = Encoding.GetEncoding(1252); // Windows default code page
StreamReader loResponseStream =
    new StreamReader(loWebResponse.GetResponseStream(), enc);

string lcHtml = loResponseStream.ReadToEnd();
loWebResponse.Close();
loResponseStream.Close();
But it doesn't seem to work; the Gmail account is still signed in.
Is it possible to execute a web request with such a URL?
Thanks
I think that's because the HttpWebRequest is made at the server, while you are logged in on the client.
You should use an iframe to load the URL instead; a rough sketch is below.
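A minimal sketch of that idea, assuming a Web Forms page with a Literal control (the control and handler names below are placeholders); the logout URL is then loaded by the user's browser, which is where the Google session cookie actually lives:

// In the button's click handler: render a hidden iframe so the browser
// (which holds the Google cookies) requests the logout URL itself.
protected void btnLogout_Click(object sender, EventArgs e)
{
    ltLogout.Text =
        "<iframe src=\"https://www.google.com/accounts/Logout\" " +
        "style=\"display:none\" width=\"0\" height=\"0\"></iframe>";
}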

ASP.Net Web API - Authorization header blank

I am having to re-write an existing REST API using .NET (originally written with Ruby). From the client's perspective, it has to work exactly the same way as the old API - i.e. the client code mustn't need to change. The current API requires Basic Authentication. So to call the old API, the following works perfectly:-
var wc = new System.Net.WebClient();
var myCache = new CredentialCache();
myCache.Add(new Uri(url), "Basic", new NetworkCredential("XXX", "XXX"));
wc.Credentials = myCache;
var returnBytes = wc.DownloadData("http://xxxx");
(I have had to omit the real URL / username / password etc. for security reasons.)
Now I am writing the new API using ASP.Net Web API with MVC4. I have a weird problem and cannot find anybody else with exactly the same problem. In order to support Basic Authentication, I have followed the guidelines here:
http://sixgun.wordpress.com/2012/02/29/asp-net-web-api-basic-authentication/
One thing: I put the code to "hook in the handler" in the Global.asax.cs file, in the Application_Start() event (that wasn't explained, so I guessed).
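For reference, hooking a Web API message handler in from Application_Start() typically looks like the sketch below (BasicAuthMessageHandler is just a placeholder for whatever the blog post's handler class is called):

protected void Application_Start()
{
    // Register the custom Basic Authentication handler with the Web API pipeline.
    GlobalConfiguration.Configuration.MessageHandlers.Add(new BasicAuthMessageHandler());

    // ... existing route/filter registration ...
}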
Anyway, if I call my API (which I have deployed in IIS) using the WebClient code above, the Authorization header is always null and the call fails with 401 Unauthorized. However, if I manually set the header using the code below, it works fine - i.e. the Authorization header now exists and I am able to authenticate the user.
private void SetBasicAuthHeader(WebClient request, String userName, String userPassword)
{
    string authInfo = userName + ":" + userPassword;
    authInfo = Convert.ToBase64String(Encoding.Default.GetBytes(authInfo));
    request.Headers["Authorization"] = "Basic " + authInfo;
}
.......
var wc = new System.Net.WebClient();
SetBasicAuthHeader(wc, "XXXX", "XXXX");
var returnBytes = wc.DownloadData("http://xxxx");
Although that works, it's no good to me because existing users of the existing API are not going to be manually setting the header.
Reading up on how Basic Authentication works, the initial request is meant to be anonymous, the client is then returned a 401, and the client is meant to try again with credentials. However, if I put a breakpoint in my code, the code from Antony's example is never hit a second time. I was expecting my breakpoint to be hit twice.
Any ideas how I can get this to work?
You're expecting the right behavior. System.Net.WebClient does not automatically include the Authorization header on the initial request. It only sends it when properly challenged by a response, which to my knowledge means a 401 status code and a proper WWW-Authenticate header. See here and here for further info.
I'm assuming your basic authentication handler is not returning the WWW-Authenticate header and as such WebClient never even attempts to send the credentials on a second request. You should be able to watch this in Fiddler or a similar tool.
If your handler did something like this, you should witness the WebClient approach working:
//if is not authenticated or Authorization header is null
return base.SendAsync(request, cancellationToken).ContinueWith(task =>
{
var response = task.Result;
response.StatusCode = HttpStatusCode.Unauthorized;
response.Headers.Add("WWW-Authenticate", "Basic realm=\"www.whatever.com\"");
return response;
});
//else (is authenticated)
return base.SendAsync(request, cancellationToken);
As you noticed, if you include the Authorization headers on every request (like you did in your alternative approach) then your handler already works as is. So it may be sufficient - it just isn't for WebClient and other clients that operate in the same way.

Open website with Basic authentication from ASP.NET

I want to open a website on a button click of an ASP.NET web page. The website requires Basic authentication. I am using the following code, but it does not open the website the way I need.
CredentialCache credentialCache = new CredentialCache();
NetworkCredential credentials = new NetworkCredential("uid", "pwd", "domain");
credentialCache.Add(new Uri("https://www.abc.com"), "Basic", credentials);

var request = (HttpWebRequest)WebRequest.Create("https://www.abc.com");
request.Credentials = credentialCache;

using (HttpWebResponse res = (HttpWebResponse)request.GetResponse())
{
    string ss = res.ResponseUri.ToString();
    StreamReader sr = new StreamReader(res.GetResponseStream());
    Response.Write(sr.ReadToEnd());
}
Response.Write writes the HTML of the requested page onto my page, but I need to redirect to that page so the user can access the website.
After a long search and some R&D I finally found a solution to this problem. We need to add a reference to the COM component Microsoft Internet Controls, i.e. using SHDocVw;
When we try to open any website which requires Basic authentication, the server responds with "HTTP/1.1 401 Unauthorized\r\n", requires the client credentials, and sends the WWW-Authenticate header to the client. When a browser receives this header it opens a pop-up box asking for the client credentials, and those credentials are then passed back to the server. The server then allows access to the application.
If we pass the client credentials on the first request to the server, it allows access to the application and doesn't ask for the credentials again.
I have passed the client credentials with the following code:
InternetExplorer IEControl = new InternetExplorer();
IWebBrowserApp webBrowserCtl = (IWebBrowserApp)IEControl;
string strUserName = "UserName";
string strPassword = "Password";
string strAuthenticationHeader = "Authorization: Basic " + Convert.ToBase64String(Encoding.Default.GetBytes(strUserName + ":" + strPassword)) + "\r\n";
webBrowserCtl.Visible = true;
webBrowserCtl.Navigate("http://www.abc.com/Home.aspx", null, null, null, strAuthenticationHeader);
Thanks
Rahul Pratap Singh

How can I make an HTTPWebRequest to an ASP.NET web service and appear that I'm logged into that domain?

I have a web service that I can only hit if I'm logged into the website that the web service is on. I need to test the service remotely, so I've written some code that creates a fake session which is logged into the site in another browser. Then I make the HTTP web request and attempt to set a cookie that contains the ASP.NET session ID of that logged-in user. But the web service doesn't detect that the web request belongs to a logged-in user or session. What do I need to give the web service to convince it this is a valid session?
// used on each read operation
byte[] buf = new byte[8192];
CookieContainer myContainer = new CookieContainer();
string sessionID = SessionID;
myContainer.Add(new Cookie("ASP.NET_SessionId", sessionID, "/", WebsiteUrl));
// prepare the web page we will be asking for
HttpWebRequest request = (HttpWebRequest)
WebRequest.Create(CreateWebserviceUrl("doneScreenScore"));
request.ContentType = "text/xml; charset=utf-8";
request.Method = "POST";
request.Accept = "text/xml";
request.Headers.Add("SOAPAction", "http://detectent.net/doneScreenScore");
request.CookieContainer = myContainer;
Stream s = request.GetRequestStream();
string soaprequest = "";
soaprequest += "<?xml version=\"1.0\" encoding=\"utf-8\"?>";
soaprequest += "<soap12:Envelope xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:soap12=\"http://www.w3.org/2003/05/soap-envelope\">";
soaprequest += " <soap12:Body>";
soaprequest += " <doneScreenScore xmlns=\"http://detectent.net/\">";
soaprequest += " <input1>string</input1>";
soaprequest += "<input2>string</input2>";
soaprequest += "<input3>string</input3>";
soaprequest += "<input4>string</input4>";
soaprequest += "</doneScreenScore>";
soaprequest += "</soap12:Body>";
soaprequest += "</soap12:Envelope>";
byte[] soapBytes = System.Text.Encoding.UTF8.GetBytes(soaprequest);
s.Write(soapBytes, 0, soapBytes.Length);
s.Close();
// execute the request
HttpWebResponse response = (HttpWebResponse)
request.GetResponse();
// we will read data via the response stream
Stream resStream = response.GetResponseStream();
string responseFromWebServiceCall = ReadResponseStream(resStream);
I think you can do something with the credentials... I am doing this for a web service:
rpt.Credentials = new System.Net.NetworkCredential(UserName, Password, Domain);
Without actually messing around with the thing myself I can't tell you exactly why it's not working. However, I've done a bit of work writing applications which masquerade as a typical user for testing purposes, and a tool I would recommend you familiarize yourself with is Wireshark (previously known as Ethereal).
Wireshark is a packet sniffer. If you run it while browsing a website normally, you will be able to see all the data being sent back and forth between your browser and the web server. Once you've found the information you need to make a request to this web service, you should have no trouble constructing a proper WebRequest which behaves in the same manner.
You can download Wireshark from the official site.
The NetworkCredentials approach works if the web site is using Windows authentication.
I've had varying success impersonating a browser when doing this type of thing. Usually I have to make a valid request to the site's login page first. The response will contain a cookie with the session ID. As long as I reuse the same CookieContainer for all the requests after that (i.e. one global CookieContainer that is reused on every request), it works fine.
On the odd occasion I still find that a website won't recognize my newly logged-in session. In that case, I manually log in with Firefox and use the Web Developer toolbar to view the cookies. I grab the cookie name/value and put them in the app settings, which the program uses to populate the cookie container.
Make sure you spoof a valid User-Agent too; I've had sites deny me because of that as well.
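A minimal sketch of that pattern (the login URL, form fields and service URL are placeholders): log in once, keep the CookieContainer, and reuse it together with a browser-like User-Agent on every later request:

// One container shared by every request so the session cookie set at login is replayed.
CookieContainer cookies = new CookieContainer();

// 1) Log in (placeholder URL and form fields).
HttpWebRequest login = (HttpWebRequest)WebRequest.Create("https://example.com/login");
login.Method = "POST";
login.ContentType = "application/x-www-form-urlencoded";
login.CookieContainer = cookies;
login.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:10.0) Gecko/20100101 Firefox/10.0";
byte[] form = Encoding.UTF8.GetBytes("user=me&pass=secret");
using (Stream body = login.GetRequestStream())
{
    body.Write(form, 0, form.Length);
}
login.GetResponse().Close(); // the response stores the session cookie in 'cookies'

// 2) Call the service with the same container and User-Agent.
HttpWebRequest svc = (HttpWebRequest)WebRequest.Create("https://example.com/service.asmx");
svc.CookieContainer = cookies;
svc.UserAgent = login.UserAgent;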
