Executing a webrequest without redirecting a page - asp.net

I ran into a weird problem while using OpenID in ASP.NET. I wanted a server-side logout for a Gmail account, but without redirecting to another page.
I thought executing a web request would do that. This is my code:
HttpWebRequest loHttp =
(HttpWebRequest)WebRequest.Create("https://www.google.com/accounts/Logout");
// *** Set properties
loHttp.Timeout = 10000; // 10 secs
loHttp.UserAgent = "Code Sample Web Client";
// *** Retrieve request info headers
HttpWebResponse loWebResponse = (HttpWebResponse)loHttp.GetResponse();
Encoding enc = Encoding.GetEncoding(1252); // Windows default Code Page
StreamReader loResponseStream =
new StreamReader(loWebResponse.GetResponseStream(), enc);
string lcHtml = loResponseStream.ReadToEnd();
loWebResponse.Close();
loResponseStream.Close();
But it doesn't seem to work: the Gmail account is still signed in.
Is it possible to execute a web request with such a URL?
Thanks

I think that's because the HttpWebRequest is made on the server, while you are logged in on the client. The logout URL has to be requested by the user's browser, which holds the Google session cookies, so you should load it in an iframe instead.
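A minimal sketch of that idea, assuming a Web Forms page with a PlaceHolder control (called phLogout here, a placeholder name) and a logout button:
// Minimal sketch (assumed control names): inject a hidden iframe so the logout
// request is made by the user's browser, which holds the Google session cookies.
protected void btnLogout_Click(object sender, EventArgs e)
{
    phLogout.Controls.Add(new System.Web.UI.LiteralControl(
        "<iframe src=\"https://www.google.com/accounts/Logout\"" +
        " style=\"display:none\" width=\"0\" height=\"0\"></iframe>"));
}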

Related

Do I need to investigate HTTP 401 caught by Fiddler during an HttpWebRequest POST to an ASP.NET WebApi server

I am totally new to WebApi, WebRequests and related things.
After hours of googling I finally managed to do a POST using C# and HttpWebRequest.
When I run the HttpWebRequest in debug mode in Visual Studio I do not get any exceptions.
My app works as I expect: I send data to the WebApi server and also get data back.
To see how my app communicates with the WebApi server I started Fiddler Web Debugger.
During the POST to WebApi, Fiddler catches a 401 error:
{"Message":"Authorization has been denied for this request."}
Stepping through in the debugger I found that the following lines of code produce the 401 error:
HttpWebRequest wr = (HttpWebRequest)WebRequest.Create(url);
wr.Credentials = new NetworkCredential(username, password);
wr.Method = "POST";
wr.ContentType = "application/json";
byte[] byteArray = System.Text.Encoding.UTF8.GetBytes(body);
wr.ContentLength = byteArray.Length;
using (System.IO.Stream dataStream = wr.GetRequestStream())
{
dataStream.Write(byteArray, 0, byteArray.Length); // after this line Fiddler catches the HTTP 401
}
Later in the code, when I call wr.GetResponse(), I do get status 200 OK.
My questions are:
Do I need to redesign my code to avoid this error in Fiddler?
Are there other methods to fill the HttpWebRequest with the JSON string besides using GetRequestStream()?
The 401 you see in Fiddler is most likely just the challenge half of the authentication handshake; the client retries the request with credentials, which is why wr.GetResponse() still returns 200 OK. If your service is enabled with Windows Authentication, then in Fiddler you can select the option to automatically authenticate using your logged-on credentials by going here:
Composer tab -> Options tab -> Automatically Authenticate
Also, why not use HttpClient from System.Net.Http? It has a much better and easier programming model. Example:
// PostAsJsonAsync comes from the System.Net.Http.Formatting assembly
// (Microsoft.AspNet.WebApi.Client package)
HttpClientHandler handler = new HttpClientHandler();
handler.UseDefaultCredentials = true; // send the current Windows credentials
HttpClient client = new HttpClient(handler);
client.BaseAddress = new Uri("http://localhost:9095/");
HttpResponseMessage response = client.PostAsJsonAsync<Customer>("api/values", cust).Result;
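If you want to avoid the extra System.Net.Http.Formatting dependency, a minimal sketch using PostAsync with a StringContent works as well; the URL, route, JSON body and credentials below are placeholders:
// Minimal sketch: POST the JSON string with StringContent instead of the
// PostAsJsonAsync extension. Needs System, System.Net, System.Net.Http and
// System.Text. URL, route, body and credentials are placeholders.
var handler = new HttpClientHandler
{
    Credentials = new NetworkCredential("username", "password")
};
using (var client = new HttpClient(handler))
{
    client.BaseAddress = new Uri("http://localhost:9095/");
    var content = new StringContent("{\"Name\":\"test\"}", Encoding.UTF8, "application/json");
    HttpResponseMessage response = client.PostAsync("api/values", content).Result;
    Console.WriteLine((int)response.StatusCode);
}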

Open website with Basic authentication by asp.net

I want to open a website on a button click of an ASP.NET web page. The website requires Basic authentication. I am using the following code, but it doesn't do what I need:
CredentialCache credentialCache = new CredentialCache();
NetworkCredential credentials = new NetworkCredential("uid", "pwd", "domain");
credentialCache.Add(new Uri("https://www.abc.com"), "Basic", credentials);
var request = (HttpWebRequest)WebRequest.Create("https://www.abc.com");
request.Credentials = credentialCache;
using (HttpWebResponse res = (HttpWebResponse)request.GetResponse())
{
string ss = res.ResponseUri.ToString();
StreamReader sr = new StreamReader(res.GetResponseStream());
Response.Write (sr.ReadToEnd());
}
Response.Write writes the HTML of the requested page onto my page, but I need to redirect to that page so the user can access the website.
After a long search and some R&D I finally found a solution to this problem. We need to add a reference to the COM component Microsoft Internet Controls, i.e. using SHDocVw;
When we try to open a website that requires Basic authentication, the server responds with "HTTP/1.1 401 Unauthorized\r\n" and sends the "WWW-Authenticate" header asking for client credentials. When a browser receives this header it opens a pop-up box to collect the credentials and passes them back to the server, which then allows access to the application.
If we pass the client credentials on the first request, the server allows access immediately and does not ask for them again.
I passed the client credentials with the following code:
InternetExplorer IEControl = new InternetExplorer();
IWebBrowserApp webBrowserCtl = (IWebBrowserApp)IEControl;
string strUserName = "UserName";
string strPassword = "Password";
string strAuthenticationHeader = "Authorization: Basic " + Convert.ToBase64String(Encoding.Default.GetBytes(strUserName + ":" + strPassword)) + "\r\n";
webBrowserCtl.Visible = true;
webBrowserCtl.Navigate("http://www.abc.com/Home.aspx", null, null, null, strAuthenticationHeader);
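If you only need the server-side request (rather than opening Internet Explorer), the same preemptive header can be set on the HttpWebRequest from the question; a minimal sketch, with placeholder credentials and URL:
// Minimal sketch: send the Basic Authorization header on the first request so
// the server never needs to issue the 401 challenge. "uid"/"pwd" and the URL
// are placeholders.
var request = (HttpWebRequest)WebRequest.Create("https://www.abc.com");
string token = Convert.ToBase64String(Encoding.UTF8.GetBytes("uid:pwd"));
request.Headers["Authorization"] = "Basic " + token;
using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string html = reader.ReadToEnd();
}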
Thanks
Rahul Pratap Singh

ASP.NET HttpWebRequest with Kerberos Authentication

I am trying to connect to a web service that uses Kerberos authentication to authorize the user, but all I get is a 401 Unauthorized every time I try to make the request. Below is the code I am using. Thanks in advance for any help you can provide!
public XPathNavigator GSASearch(string url, string searchString)
{
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url + searchString);
request.CookieContainer = new CookieContainer();
request.Credentials = CredentialCache.DefaultCredentials;
request.ContentType = "text/xml";
request.Method = "POST";
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream receiveStream = response.GetResponseStream();
XPathDocument doc = new XPathDocument(receiveStream);
return doc.CreateNavigator();
}
EDIT: I feel I should explain a bit more about what I am attempting to do. I have been tasked with providing a new interface for my company's Google Search Appliance. I am using an ASP.NET page, which does some things like choose a collection depending on where a user is located, etc., and then sends the appropriate search string to the GSA. This was all working well until they decided to turn authentication on, and now I can't get any results (I either get a 401 Unauthorized, or a message stating that 'Data at the root level is invalid'). If I take the search string and provide it directly to the GSA, it authenticates fine and displays the results; I just can't seem to get it through the HttpWebRequest.
EDIT 2: I did a little more looking (I ran the request through Fiddler) and it looks like the request is only attempting Negotiate and not Kerberos. I set the credentials to use Kerberos explicitly, as below, but it didn't help...
public XPathNavigator GSASearch(string url, string searchString)
{
CredentialCache credCache = new CredentialCache();
credCache.Add(new Uri(url), "Kerberos", CredentialCache.DefaultNetworkCredentials);
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url + searchString);
request.CookieContainer = new CookieContainer();
request.PreAuthenticate = true;
request.Credentials = credCache;
request.ContentType = "text/xml";
request.Method = "POST";
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream receiveStream = response.GetResponseStream();
//StreamReader readStream = new StreamReader(receiveStream);
XPathDocument doc = new XPathDocument(receiveStream);
return doc.CreateNavigator();
}
EDIT 3: Ok, looking closer again, the CredentialCache.DefaultCredentials doesn't appear to have my network credentials in it...
1) Have you done a Wireshark trace of a successful session to the GSA using the browser? Does that work?
2) If #1 works, what is the WWW-Authenticate header sent by the GSA on the first unauthenticated request?
3) Is the machine on which the ASPX app is running part of the same AD domain that the GSA is in? AFAIK this is probably required for a successful auth.
4) Next, since it is the ASPX app that is making the request, you cannot use DefaultCredentials, because you actually need the credentials of a user that is trusted by the GSA. You should either create a special user account for the app that talks to the GSA (see the sketch below), or have each user be a trusted user on the GSA, have the ASPX page authenticate the user first, and then pass those credentials on to the GSA using delegation. For the latter you will also have to make the server running the ASPX app trusted for delegation.
In my opinion, you should first model your code as a console app that you can run and debug, then port it to the ASPX page. That way you will know whether the failure is due to the host (ASPX vs. console) or something else.
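A minimal sketch of the first option in point 4, using an explicit service account instead of DefaultCredentials; the account name, password, domain and GSA URL are all placeholders:
// Minimal sketch: authenticate to the GSA with a dedicated service account
// rather than the ASPX worker process identity. All names below are placeholders.
CredentialCache credCache = new CredentialCache();
credCache.Add(new Uri("http://gsa.example.com/"), "Negotiate",
    new NetworkCredential("svc-gsa-search", "password", "EXAMPLEDOMAIN"));
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://gsa.example.com/search?q=test");
request.PreAuthenticate = true;
request.Credentials = credCache;
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (Stream receiveStream = response.GetResponseStream())
{
    XPathDocument doc = new XPathDocument(receiveStream);
    // use doc.CreateNavigator() as in the question
}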

HTML scraping: Forms authentication failed for the request. The ticket supplied has expired

The ActiveForums module we're using as part of our DotNetNuke system has a bug in the XML for its RSS feed: it doesn't correctly encode ampersands, leaving them as & rather than encoding them as &amp;.
I've reported the bug to the company, but in the meantime I need a fix. So what I've done is create an intermediary page that requests the RSS feed via System.Net.HttpWebRequest.Create(url) and then performs a Regex.Replace to encode any unencoded ampersands.
The problem is that when I run the code on our production server I get an exception: The remote server returned an error: (500) Internal Server Error.
The only reason I could think of was authentication (as the server requires NTLM), but as far as I can tell I'm doing that part correctly. My code is shown below:
string html = string.Empty;
string url = "http://intranet.nt.avs/dnn/Default.aspx?tabid=130";
WebResponse response;
WebRequest request = System.Net.HttpWebRequest.Create(url);
request.PreAuthenticate = true;
request.Credentials = System.Net.CredentialCache.DefaultCredentials;
response = request.GetResponse();
using (StreamReader sr = new StreamReader(response.GetResponseStream()) )
{
html = sr.ReadToEnd();
}
// Clean invalid XML
html = Regex.Replace(html, "&(?!amp;|gt;|lt;|quot;|apos;)", "&amp;", RegexOptions.Multiline | RegexOptions.IgnoreCase);
Response.ContentType = "text/xml";
Response.Write( html );
Updated: Here's what the event log says
Error code: 4005
Event message: Forms authentication failed for the request. Reason: The ticket supplied has expired

How can I make an HttpWebRequest to an ASP.NET web service and appear to be logged into that domain?

I have a web service that I can only hit if I'm logged into the website the web service is on. I need to test the service remotely, so I've written some code that creates a fake session which is logged into the site in another browser. Then I make the HTTP web request and attempt to set a cookie that contains the ASP.NET session ID of that logged-in user. But the web service doesn't detect that the web request comes from a logged-in user or session. What do I need to give the web service to convince it this is a valid session?
// used on each read operation
byte[] buf = new byte[8192];
CookieContainer myContainer = new CookieContainer();
string sessionID = SessionID;
myContainer.Add(new Cookie("ASP.NET_SessionId", sessionID, "/", WebsiteUrl));
// prepare the web page we will be asking for
HttpWebRequest request = (HttpWebRequest)
WebRequest.Create(CreateWebserviceUrl("doneScreenScore"));
request.ContentType = "text/xml; charset=utf-8";
request.Method = "POST";
request.Accept = "text/xml";
request.Headers.Add("SOAPAction", "http://detectent.net/doneScreenScore");
request.CookieContainer = myContainer;
Stream s = request.GetRequestStream();
string soaprequest = "";
soaprequest += "<?xml version=\"1.0\" encoding=\"utf-8\"?>";
soaprequest += "<soap12:Envelope xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:soap12=\"http://www.w3.org/2003/05/soap-envelope\">";
soaprequest += " <soap12:Body>";
soaprequest += " <doneScreenScore xmlns=\"http://detectent.net/\">";
soaprequest += " <input1>string</input1>";
soaprequest += "<input2>string</input2>";
soaprequest += "<input3>string</input3>";
soaprequest += "<input4>string</input4>";
soaprequest += "</doneScreenScore>";
soaprequest += "</soap12:Body>";
soaprequest += "</soap12:Envelope>";
s.Write(System.Text.Encoding.ASCII.GetBytes(soaprequest), 0, soaprequest.Length);
s.Close();
// execute the request
HttpWebResponse response = (HttpWebResponse)
request.GetResponse();
// we will read data via the response stream
Stream resStream = response.GetResponseStream();
string responseFromWebServiceCall = ReadResponseStream(resStream);
I think that you can do something with the credentials... I am doing this for a webservice...
rpt.Credentials = new System.Net.NetworkCredential(UserName, Password, Domain);
Without actually messing around with the thing myself I can't tell you exactly why it's not working. However, I've done a bit of work writing applications that masquerade as a typical user for testing purposes, and a tool I would recommend you familiarize yourself with is Wireshark (previously known as Ethereal).
Wireshark is a packet sniffer. If you run it while browsing a website normally, you will be able to see all the data being sent back and forth between your browser and the web server. Once you've found the information you need to make a request to this web service, you should have no trouble constructing a WebRequest that behaves in the same manner.
You can download Wireshark from the official site.
The NetworkCredentials work if the web site is using Windows authentication.
I've had varying success impersonating a browser when doing this type of thing. Usually I have to make a valid request to their login page first; the response will contain a cookie with the session ID. As long as I reuse the same CookieContainer for all the requests after that (i.e. one global container that is reused on every request), it works fine.
On the odd occasion I still find that a website won't recognize my newly logged-in session. In that case I manually log in with Firefox and use the Web Developer toolbar to view the cookies. I grab the cookie name/value and put them in the app settings, which the program uses to populate the cookie collection.
Make sure you spoof a valid user agent too; I've had sites deny me because of that as well.
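A minimal sketch of that login-first approach, assuming a forms login page; the URLs, form field names and user agent string below are placeholders:
// Minimal sketch: POST to the login page first, then reuse the same
// CookieContainer so the session cookie it sets is sent with every later
// request. URLs, field names and the user agent are placeholders.
CookieContainer cookies = new CookieContainer();
HttpWebRequest login = (HttpWebRequest)WebRequest.Create("http://example.com/Login.aspx");
login.Method = "POST";
login.ContentType = "application/x-www-form-urlencoded";
login.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:60.0)";
login.CookieContainer = cookies;
byte[] form = Encoding.UTF8.GetBytes("username=user&password=pass");
using (Stream s = login.GetRequestStream())
    s.Write(form, 0, form.Length);
((HttpWebResponse)login.GetResponse()).Close(); // cookies now holds the session cookie
HttpWebRequest service = (HttpWebRequest)WebRequest.Create("http://example.com/Service.asmx");
service.CookieContainer = cookies; // same container, so the session is recognized
service.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:60.0)";
// ... build and send the SOAP request as in the question ...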
