In the application I am currently working on, a backend Java app caches a bunch of data, while the ASP.NET part lets users update database tables. Each time the DB is updated, the cache in the Java application should be cleared. So basically I have a list of 4 URLs that each need to be hit in order to clear the cache. My basic solution was to loop through the URLs, create an HttpWebRequest for each one, and read the response. So basically I have this for each request:
var request = (HttpWebRequest)WebRequest.Create(url);
request.Method = "POST";
request.ContentLength = 0;
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    Stream receiveStream = response.GetResponseStream();
    StreamReader readStream = new StreamReader(receiveStream, Encoding.UTF8);
    string responseString = readStream.ReadToEnd();
    returnList.Add(string.Format("Refresh response from {0}.<br />{1}", url, responseString));
    readStream.Close();
    receiveStream.Close();
}
On my local machine everything works great, but when I deploy to our development server it just hangs and does nothing. If I remove request.ContentLength = 0;, the remote server throws a 411 Length Required error instead.
I am really stuck here and any help would be greatly appreciated.
Either a solution to the HttpWebRequest problem I am having or a different solution to calling each URL would work, I'm not picky.
Thanks in advance.
Why are you using request.Method = "POST"? Are you actually posting any data? If not, try removing both the ContentLength assignment and the method override so the request goes out as a plain GET.
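For illustration, a minimal sketch of the refresh call issued as a GET with no body, assuming the url variable and returnList from the question's code:

// Hypothetical GET variant: no Method or ContentLength set, so HttpWebRequest
// defaults to GET and no request body is expected.
var request = (HttpWebRequest)WebRequest.Create(url);

using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8))
{
    string responseString = reader.ReadToEnd();
    returnList.Add(string.Format("Refresh response from {0}.<br />{1}", url, responseString));
}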
Pretty sure this was a network issue. I tried hitting a different URL (the load balancer) and had no problems, so the Java guys are making a change so I can just hit the load balancer, and whichever server the request ends up on will make sure all the servers' caches are cleared.
The code that is working:
var request = (HttpWebRequest)WebRequest.Create(url);
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    Stream receiveStream = response.GetResponseStream();
    StreamReader readStream = new StreamReader(receiveStream, Encoding.UTF8);
    string responseString = readStream.ReadToEnd();
    returnString = string.Format(@"Refresh response from<br />{0}{1}", url, responseString);
    readStream.Close();
    receiveStream.Close();
}
Related
I'm setting up a project on a new dev machine (Win7, VS2012).
The code works fine on the production machine.
On the new machine this code gives me "The operation has timed out":
var request = (HttpWebRequest) System.Net.WebRequest.Create(url);
request.Timeout = TimeoutMilliSeconds;
request.Method = "GET";
request.UserAgent = "X";
var response = (HttpWebResponse)request.GetResponse();
The URL works fine when I call it from the browser.
I know which ports are open for incoming traffic.
I'm using IIS Express.
How do I proceed?
Thanks
Set the ReadWriteTimeout property. For details, see:
http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.readwritetimeout.aspx
request.ReadWriteTimeout = 5000; // Increase the value as your requirements dictate.
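Putting this together with the question's code, a minimal sketch (url and TimeoutMilliSeconds are taken from the question; the reader usage is an assumption for illustration):

var request = (HttpWebRequest)WebRequest.Create(url);
request.Method = "GET";
request.UserAgent = "X";
request.Timeout = TimeoutMilliSeconds;  // time allowed for GetResponse() to complete
request.ReadWriteTimeout = 5000;        // time allowed for reads/writes on the response stream

using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string body = reader.ReadToEnd();
}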
I am totally new to Web API, web requests, and related things.
After hours of googling, I finally managed to do a POST using C# and HttpWebRequest.
When I run the HttpWebRequest in debug mode in Visual Studio I do not get any exceptions.
My app works as I expect: I send data to the Web API server and also get data back.
To see how my app communicates with the Web API server, I started Fiddler Web Debugger.
During the POST to Web API, Fiddler catches 401 errors:
{"Message":"Authorization has been denied for this request."}
Stepping through in the debugger, I found that the following lines of code produce the 401 error:
HttpWebRequest wr = (HttpWebRequest)WebRequest.Create(url);
wr.Credentials = new NetworkCredential(username, password);
wr.Method = "POST";
wr.ContentType = "application/json";
byte[] byteArray = System.Text.Encoding.UTF8.GetBytes(body);
wr.ContentLength = byteArray.Length;
using (System.IO.Stream dataStream = wr.GetRequestStream())
{
    dataStream.Write(byteArray, 0, byteArray.Length); // After this line, Fiddler catches the HTTP 401
}
Later in the code, when I call wr.GetResponse(), I do get a 200 OK status.
My questions are:
Do I need to redesign my code to avoid this error in Fiddler?
Are there other ways to fill the HttpWebRequest with a JSON string besides using GetRequestStream()?
If your service uses Windows Authentication, then in Fiddler you can select the option to automatically authenticate using your logged-on credentials by going here:
Composer tab -> Options tab -> Automatically Authenticate
Also, why not use HttpClient from System.Net.Http? It has a much better and easier programming model. Example:
HttpClientHandler handler = new HttpClientHandler();
handler.UseDefaultCredentials = true;
HttpClient client = new HttpClient(handler);
client.BaseAddress = new Uri("http://localhost:9095/");
HttpResponseMessage response = client.PostAsJsonAsync<Customer>("api/values", cust).Result;
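If your code can use async/await, a sketch of the same call without blocking on .Result (Customer, cust, and the route are assumed from the example above, and this is assumed to run inside an async method):

// Hypothetical async variant of the example above.
var handler = new HttpClientHandler { UseDefaultCredentials = true };
using (var client = new HttpClient(handler))
{
    client.BaseAddress = new Uri("http://localhost:9095/");

    // PostAsJsonAsync comes from the System.Net.Http.Formatting package.
    HttpResponseMessage response = await client.PostAsJsonAsync("api/values", cust);
    response.EnsureSuccessStatusCode();
}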
Environment: ASP.NET MVC 4 using C#
I need to get an image by making a GET request to the URL /inbound/faxes/{id}/image.
I used the code below:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("/inbound/faxes/238991717/image");
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
System.IO.StreamReader stream = new StreamReader(response.GetResponseStream());
but it fails with "URL not valid".
I used the complete URL www.interfax.net/inbound/faxes/{id}/image,
but the result is the same.
I want to follow this article to receive faxes:
Accepting incoming fax notifications by callback
Can anyone help me retrieve the fax?
Try like this:
using (var client = new WebClient())
{
    byte[] imageData = client.DownloadData("http://www.interfax.net/inbound/faxes/{id}/image");
}
Notice how the url is prefixed with the protocol (HTTP in this case). Also make sure you have replaced the {id} part of the url with the actual id of the image you are trying to retrieve.
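As a usage sketch, since the environment is ASP.NET MVC 4, the downloaded bytes could be returned from an action; the action name, id parameter, and the "image/tiff" content type are assumptions for illustration, not values from the question:

// Hypothetical MVC action that proxies the fax image.
public ActionResult FaxImage(long id)
{
    using (var client = new WebClient())
    {
        byte[] imageData = client.DownloadData(
            "http://www.interfax.net/inbound/faxes/" + id + "/image");
        return File(imageData, "image/tiff"); // content type is an assumption
    }
}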
I am trying to connect to a web service that uses Kerberos authentication to authorize the user, but all I get is a 401 Unauthorized every time I make the request. Below is the code I am using. Thanks in advance for any help you can provide!
public XPathNavigator GSASearch(string url, string searchString)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url + searchString);
    request.CookieContainer = new CookieContainer();
    request.Credentials = CredentialCache.DefaultCredentials;
    request.ContentType = "text/xml";
    request.Method = "POST";
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    Stream receiveStream = response.GetResponseStream();
    XPathDocument doc = new XPathDocument(receiveStream);
    return doc.CreateNavigator();
}
EDIT: I feel I should explain a bit more what I am attempting to do. I have been tasked with providing a new interface for my company's Google Search Appliance. I am using an ASP.NET page, which does some things like choose a collection depending on where a user is located, etc., and then sends the appropriate search string to the GSA. This was all working well until they decided to turn authentication on, and now I can't get any results (I either get a 401 Unauthorized or a message stating that 'Data at the root level is invalid'). If I take the search string and provide it directly to the GSA, it authenticates fine and displays the results; I just can't seem to get it to work through the HttpWebRequest.
EDIT 2: I did a little more looking (ran the request through Fiddler) and it looks like the request is only attempting Negotiate and not Kerberos. I set the credentials to use Kerberos explicitly as below, but it didn't help...
public XPathNavigator GSASearch(string url, string searchString)
{
    CredentialCache credCache = new CredentialCache();
    credCache.Add(new Uri(url), "Kerberos", CredentialCache.DefaultNetworkCredentials);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url + searchString);
    request.CookieContainer = new CookieContainer();
    request.PreAuthenticate = true;
    request.Credentials = credCache;
    request.ContentType = "text/xml";
    request.Method = "POST";
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    Stream receiveStream = response.GetResponseStream();
    //StreamReader readStream = new StreamReader(receiveStream);
    XPathDocument doc = new XPathDocument(receiveStream);
    return doc.CreateNavigator();
}
EDIT 3: Ok, looking closer again, the CredentialCache.DefaultCredentials doesn't appear to have my network credentials in it...
1) Have you done a wireshark trace of a successful session to the GSA using the browser? Does that work?
2) If #1 works, what is the WWW-Authenticate header that is sent by the GSA on the first unauthenticated request?
3) Is the machine on which the ASPX app is running a part of the same AD domain that the GSA is in? AFAIK this is probably required for a successful auth.
4) Next, since it is the ASPX app that is making the request, you cannot use DefaultCredentials, because you actually need the credentials of a user that is trusted by the GSA. For this you should either create a special user account for the app that talks to the GSA, or make each user a trusted user on the GSA, have the ASPX page authenticate the user first, and then pass those credentials to the GSA using delegation (see the sketch below). For the delegation route you will also have to make the server running the ASPX app trusted for delegation.
In my opinion, you should first model your code as a console app that you can run and debug, then port it to the ASPX page. That way you will know whether the failure is due to the host (ASPX vs. console) or something else.
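For point 4, a minimal sketch of explicitly supplying a dedicated account's credentials instead of DefaultCredentials (the account name, password, and domain are placeholders, not values from the question; url and searchString are from the question's code):

// Hypothetical: authenticate to the GSA as a dedicated service account.
var credCache = new CredentialCache();
credCache.Add(new Uri(url), "Negotiate",
    new NetworkCredential("gsaServiceAccount", "password", "MYDOMAIN")); // placeholders

var request = (HttpWebRequest)WebRequest.Create(url + searchString);
request.PreAuthenticate = true;
request.Credentials = credCache;
request.ContentType = "text/xml";
request.Method = "POST";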
I'm using the WebRequest class in .NET to POST data to a server, which sends back a response.
The weird thing is that it works when I start Fiddler to analyze my network traffic, but without Fiddler it doesn't.
So I started analyzing the packets sent to and from my computer with Wireshark, which makes it simple to follow the TCP stream. With Fiddler on, I can see that the correct request header/body is sent and the response header/body comes back. The strange part is that when I don't use Fiddler, the request header is sent, then I get the response header/body, and finally the request body appears at the end of the TCP stream.
Here is the code I've been working with:
string lcUrl = "http://XX.XX.XXX.XX";
// *** Establish the request
HttpWebRequest loHttp = (HttpWebRequest)WebRequest.Create(lcUrl);
string lcPostData = testdata;
loHttp.Method = "POST";
byte [] lbPostBuffer = System.Text.Encoding.GetEncoding(1252).GetBytes(lcPostData);
loHttp.ContentLength = lbPostBuffer.Length;
loHttp.Credentials = CredentialCache.DefaultCredentials;
//loHttp.SendChunked = true;
loHttp.ServicePoint.Expect100Continue = false;
Stream loPostData = loHttp.GetRequestStream();
loPostData.Write(lbPostBuffer, 0, lbPostBuffer.Length);
loPostData.Close();
HttpWebResponse loWebResponse = (HttpWebResponse)loHttp.GetResponse();
Encoding enc = System.Text.Encoding.GetEncoding(1252);
StreamReader loResponseStream = new StreamReader(loWebResponse.GetResponseStream(), enc);
string lcHtml = loResponseStream.ReadToEnd();
loWebResponse.Close();
loResponseStream.Close();
Please use the following code. It seems the problem is with when the underlying stream is sent to the remote server.
string lcUrl = "http://XX.XX.XXX.XX";
// *** Establish the request
HttpWebRequest loHttp = (HttpWebRequest)WebRequest.Create(lcUrl);
string lcPostData = testdata;
loHttp.Method = "POST";
byte[] lbPostBuffer = System.Text.Encoding.GetEncoding(1252).GetBytes(lcPostData);
loHttp.ContentLength = lbPostBuffer.Length;
loHttp.Credentials = CredentialCache.DefaultCredentials;
//loHttp.SendChunked = true;
loHttp.ServicePoint.Expect100Continue = false;
using (Stream loPostData = loHttp.GetRequestStream())
{
    loPostData.Write(lbPostBuffer, 0, lbPostBuffer.Length);
}

string lcHtml;
using (HttpWebResponse loWebResponse = (HttpWebResponse)loHttp.GetResponse())
{
    Encoding enc = System.Text.Encoding.GetEncoding(1252);
    using (StreamReader loResponseStream = new StreamReader(loWebResponse.GetResponseStream(), enc))
    {
        lcHtml = loResponseStream.ReadToEnd();
    }
}
// Perform processing of data here....
I would also suggest adding the following to your application's app.config file. This helps when the server returns a response that does not conform to the way .NET handles HTTP requests:
<configuration>
  <system.net>
    <settings>
      <httpWebRequest useUnsafeHeaderParsing="true" />
    </settings>
  </system.net>
</configuration>
I suspect the client is waiting for the "HTTP/1.1 100 Continue" response from the server. Here is how it works: when you are posting data to the server, the server might not be ready to accept the data just yet; for example, it may want to authenticate the client first.
So, when you send a POST request, the client just sends the request headers, with an "Expect: 100-continue" appended.
POST /url HTTP/1.1
Server: Server-name/fqdn
Content-Length: 100
Expect: 100-continue
If the server is ready to receive the data it responds with:
HTTP/1.1 100 continue
Server: server-name/fqdn
Now, the client can send the data.
However if the server is not ready to receive the data, and wants to authenticate the client, it will respond with a different status code.
If you post your Wireshark trace to pastebin.com I can verify, but I suspect this is what is happening.
The reason you don't see this in Fiddler might be that Fiddler uses HttpListener to listen for HTTP requests, and HttpListener hides intermediate responses like 100 Continue from the app (in this case, Fiddler).
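As a related experiment, you can stop HttpWebRequest from sending the Expect: 100-continue header at all, either per service point (as the question's code already does) or globally. A minimal sketch, assuming it runs before any requests are created; this is a diagnostic step, not necessarily the final fix:

// Disable the Expect: 100-continue handshake for all subsequent HttpWebRequests.
System.Net.ServicePointManager.Expect100Continue = false;

// Per-request alternative (equivalent to loHttp.ServicePoint.Expect100Continue = false
// in the question's code):
var request = (HttpWebRequest)System.Net.WebRequest.Create("http://XX.XX.XXX.XX");
request.ServicePoint.Expect100Continue = false;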