Sending Requests To Mobile Sites Using HttpClient

I am new to Task-based programming and the new HttpClient class, but I have read the examples and documentation on MSDN and have a basic understanding of both. I tried to create a basic application that sends an async request, but it failed right away. It seems to be a problem with the URL; have a look at the code first:
public static async void ScrapeDailyRaces()
{
    HttpClient httpClient = new HttpClient();
    Stream myStream = await httpClient.GetStreamAsync("https://mobile.bet365.com/");
}
I tried replacing the URL with http://www.google.com and also https://www.google.com; both worked, so it isn't a problem with HTTPS. I also tried adding www to the faulty URL, resulting in https://www.mobile.bet365.com/, but it still doesn't work. Any ideas?
Exception details: "The underlying connection was closed: An unexpected error occurred on a receive." and "An error occurred while sending the request."

Just in case someone ran into the same problem, I fixed it by adding a user agent to the request using the following:
httpClient.DefaultRequestHeaders.Add("user-agent", "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; WOW64; Trident/6.0)");
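For completeness, a minimal sketch of the method from the question with the header applied (switching to async Task so the caller can await it):
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

public static async Task ScrapeDailyRaces()
{
    var httpClient = new HttpClient();
    // Some servers reject requests that do not carry a User-Agent header.
    httpClient.DefaultRequestHeaders.Add("user-agent", "Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; WOW64; Trident/6.0)");
    Stream myStream = await httpClient.GetStreamAsync("https://mobile.bet365.com/");
}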
Hope this helps.

Related

HttpWebRequest throws 404 error in .NET Core 2.2 but works in 3.0

I found out that making an HTTP GET request works in a simple .NET Core console application like this:
static void Main(string[] args)
{
    string url = "https://www.zdnet.com/article/quantum-entanglement-breakthrough-could-boost-encryption-secure-communications/#ftag=RSSbaffb68";
    var req = (HttpWebRequest)WebRequest.Create(url);
    req.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:74.0) Gecko/20100101 Firefox/74.0";
    var resp = (HttpWebResponse)req.GetResponse();
    var respStream = new StreamReader(resp.GetResponseStream(), true);
    var html = respStream.ReadToEnd();
}
Now I change the framework in the csproj file from
<TargetFramework>netcoreapp3.0</TargetFramework>
to
<TargetFramework>netcoreapp2.2</TargetFramework>
and run the same code again. This time, it throws a 404 error:
System.Net.WebException: "The remote server returned an error: (404) Not Found."
Why does the exact same code throw a 404 error in .NET Core 2.2? It's a relatively simple request, and the HttpWebRequest API is not new. I also got 404 errors when using the newer HttpClient. Does anyone know what changed in .NET Core 3.0 that could explain this behavior?
I analyzed the requests using mitmproxy. It allows decrypting TLS traffic, which is not easily possible with traditional network tools like Wireshark. It's very easy: just download the mitmproxy binaries, run the mitmweb binary, and add the proxy to the HttpWebRequest instance:
req.Proxy = new WebProxy("http://localhost:8080");
req.ServerCertificateValidationCallback += (sender, certificate, chain, sslPolicyErrors) => true;
For simplicity, I didn't add the MITM certificate to the trust store, since it's just for local testing. After capturing a working request on .NET Core 3.0 and a failing one on 2.2, I saw that 2.2 redirects from https://www.zdnet.com/article/quantum-entanglement-breakthrough-could-boost-encryption-secure-communications/#ftag=RSSbaffb68 to https://www.zdnet.com/article/quantum-entanglement-breakthrough-could-boost-encryption-secure-communications/#ftag=rssbaffb68, and the second request returns 404.
.NET Core 3.0 seems to strip the fragment, because it requests https://www.zdnet.com/article/quantum-entanglement-breakthrough-could-boost-encryption-secure-communications/, which works. So I fixed this in my .NET Core 2.2 application with a simple workaround:
var uri = new Uri(url);
url = $"{uri.Scheme}://{uri.Host}{uri.PathAndQuery}";
var document = GetDocumentNode(url);
This strips the fragment from the URL, resulting in no more 404 errors.
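If you'd rather not rebuild the string by hand, UriBuilder offers an equivalent way to drop the fragment (a small sketch, assuming the same url variable as above):
var builder = new UriBuilder(url);
builder.Fragment = string.Empty;  // clears the #ftag=... fragment
url = builder.Uri.AbsoluteUri;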

HttpClient on ASP.NET Core application times out when connecting to service on localhost

I have an XML-RPC server (using XML-RPC.NET) running as a .NET console application. I'm trying to connect to it from my ASP.NET Core (2.1.1) web app, but the client keeps timing out. Postman gets a response immediately without issues.
Here is how I'm calling it:
HttpClient client = _clientFactory.CreateClient();
client.Timeout = TimeSpan.FromSeconds(10);
var httpRequest = new HttpRequestMessage(HttpMethod.Post, instance.ServiceUrl);
var stringContent = new ByteArrayContent(Encoding.UTF8.GetBytes(request.ToString()));
httpRequest.Content = stringContent;
httpRequest.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("text/xml");
var httpResponse = await client.SendAsync(httpRequest);
var response = await httpResponse.Content.ReadAsByteArrayAsync();
I can see that the request is made successfully, as the console app returns a response. Fiddler shows a 200 response, but await client.SendAsync(httpRequest); times out!
The request usually completes in under 10 ms, so the timeout value is just for debugging; if I leave it out, it takes the default 60 seconds. The response returns XML.
I've tried rewriting this to use StringContent with PostAsync: same issue. I also attempted to rewrite it using WebClient, but that returned "The remote server returned an error: (100) Continue." Not sure if that's relevant.
Been stuck on this for a while; does anyone know what could be happening?
OK, I did some googling, and it looks like I needed this line:
client.DefaultRequestHeaders.ExpectContinue = true;
It was definitely related to the 100 Continue status code not being handled properly.
Found it here:
https://social.msdn.microsoft.com/Forums/en-US/042016f0-d70e-42f9-9924-5febeb2bea86/excluding-the-quotexpect-100continuequot-header-from-httpwebrequest-posts?forum=winappswithcsharp
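Putting it together, a sketch of the client code from the question with the fix applied (instance.ServiceUrl and request are the question's own variables):
HttpClient client = _clientFactory.CreateClient();
client.DefaultRequestHeaders.ExpectContinue = true;  // handle the interim 100 Continue response
client.Timeout = TimeSpan.FromSeconds(10);

var httpRequest = new HttpRequestMessage(HttpMethod.Post, instance.ServiceUrl);
httpRequest.Content = new ByteArrayContent(Encoding.UTF8.GetBytes(request.ToString()));
httpRequest.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("text/xml");

var httpResponse = await client.SendAsync(httpRequest);
var response = await httpResponse.Content.ReadAsByteArrayAsync();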

How to show a PDF file present on a remote server in a web page

I am trying to show a PDF file, hosted on my web server, on an HTML page.
My code is:
Dim WC As WebClient = New WebClient()
WC.UseDefaultCredentials = False
Dim CREDS As CredentialCache = New CredentialCache()
CREDS.Add(New Uri("IPadress"), "Basic", New NetworkCredential("username", "password"))
WC.Credentials = CREDS
WC.Headers.Add(HttpRequestHeader.UserAgent, "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)")
Try
    WC.DownloadFile("ftp://111.22.33.444/Folder/Folder/Folder/UPLOAD/File1.pdf", "myFile.pdf")
Catch ex As Exception
    ' Swallowing the exception hides the real error; log ex here.
End Try
This code runs successfully in a Windows app, but in ASP.NET it gives the error "The remote server returned an error: (404) Not Found".
Please help. Thanks in advance...
WebClient is for the HTTP protocol; I believe you need to use FtpWebRequest instead. See the FtpWebRequest documentation for example usage. If your FTP login places you in a subdirectory by default, you may need a workaround for that as well.
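Since the question is VB.NET but the rest of this page is C#, here is a hedged C# sketch of the FtpWebRequest approach, reusing the placeholder server, path, and credentials from the question:
using System.IO;
using System.Net;

// Placeholder server, path, and credentials from the question.
var request = (FtpWebRequest)WebRequest.Create("ftp://111.22.33.444/Folder/Folder/Folder/UPLOAD/File1.pdf");
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.Credentials = new NetworkCredential("username", "password");

using (var response = (FtpWebResponse)request.GetResponse())
using (var responseStream = response.GetResponseStream())
using (var fileStream = File.Create("myFile.pdf"))
{
    responseStream.CopyTo(fileStream);  // stream the PDF to a local file
}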

Status of Internal websites by using PING

Afternoon all,
Just after a bit of advice on the best method to use for the following.
I am newish to .NET and have an ASP.NET web page in development that simply lists some internal websites, pings each one, and shows its status (online/offline). This is currently triggered by the click of a button.
I need to set up this development web page so that it automatically runs at a specific time in the morning, say 7am for argument's sake, and then notifies a user group by email of the status of these items.
I have used Microsoft Visual Studio (VB) 2010 before and can create simple web pages that connect to, extract from, and update data in SQL Server 2008. I have also had some experience creating scheduled jobs in SQL, but not much.
I thought I could maybe create a scheduled job in SQL Server 2008, find a way to populate the data into the database, use this data in my website, and display it in a GridView or something, and have either the SQL job or the website email a group of users the status of these internal websites.
Does anyone know if I would be able to complete the above just in .NET? Am I able to write a script of some sort, or schedule the web page to run at a specified time?
I'm not 100% sure of the best method to tackle this job, and I have limited experience. Can anyone suggest the best way to complete the above?
Regards
Bet.
Although fairly trivial to implement, I don't believe a ping is useful in the context of what you are trying to achieve.
As Fredrik pointed out, a ping only tells you that the server is reachable. It says nothing about whether an individual website on that server is functional.
If I were doing this, I would create a service that runs every so often. The service would issue a GET request to each website, do a little parsing of the content to make sure what came back was expected, and update a database record with the time of the check and the status (up/down).
For example:
public String CheckSite(String postLocation) {
    String result = String.Empty;
    HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(postLocation);
    // Setting the user agent seems to resolve certain issues that *may* crop up depending on the server
    httpRequest.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)";
    httpRequest.KeepAlive = false;
    httpRequest.Method = "GET";
    using ( HttpWebResponse httpResponse = (HttpWebResponse)httpRequest.GetResponse() ) {
        using ( StreamReader reader = new StreamReader(httpResponse.GetResponseStream()) ) {
            result = reader.ReadToEnd();
        } // using reader
    } // using httpResponse
    return result;
}
This simple call loads a page from the server. From there you can parse the content for words like "error" or whatever suits your sites; provided it looks good, you report back that the site is up.
I know the above is C#, but you should be able to convert it to VB easily if necessary. You should also place a try..catch around the call: if it errors out, you know the server is completely offline; if the page returns but contains "error" or similar, you know the server is up but the app is down.
Personally, I think the most elegant solution would be (untested):
public String CheckSite(String postLocation) {
    HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(postLocation);
    // Setting the user agent seems to resolve certain issues that *may* crop up depending on the server
    httpRequest.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)";
    httpRequest.KeepAlive = false;
    httpRequest.Method = "GET";
    using ( HttpWebResponse httpResponse = (HttpWebResponse)httpRequest.GetResponse() ) {
        if ( httpResponse.StatusCode != HttpStatusCode.OK ) {  // or whatever statuses you consider valid
            // Do something based upon an invalid response.
        }
        return httpResponse.StatusCode.ToString();
    } // using httpResponse
}
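The question also asks about scheduling the check for 7am and emailing a user group, which can be done in .NET alone. A minimal sketch using System.Threading.Timer and SmtpClient in a console service; the site list, SMTP host, and addresses are made-up placeholders:
using System;
using System.Net;
using System.Net.Mail;
using System.Threading;

class SiteMonitor
{
    // Hypothetical site list and mail settings; substitute your own values.
    static readonly string[] Sites = { "http://intranet/site1", "http://intranet/site2" };

    static void Main()
    {
        // Runs once per day; computing the initial delay until 7am is omitted for brevity.
        // Keep a reference to the timer so it isn't garbage collected.
        var timer = new Timer(_ => RunChecks(), null, TimeSpan.Zero, TimeSpan.FromDays(1));
        Console.ReadLine();  // keep the process alive
    }

    static void RunChecks()
    {
        var report = "";
        foreach (var site in Sites)
        {
            try
            {
                var req = (HttpWebRequest)WebRequest.Create(site);
                req.Method = "GET";
                using (req.GetResponse()) { }  // throws on connection failures and error status codes
                report += site + ": up\n";
            }
            catch (WebException)
            {
                report += site + ": down\n";
            }
        }
        using (var smtp = new SmtpClient("smtp.example.local"))
        {
            smtp.Send("monitor@example.local", "team@example.local", "Internal site status", report);
        }
    }
}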

Url fragment is empty

Using ASP.NET MVC3, requesting the URL http://localhost:22713/tests#123456 with the following code:
Your user agent: @Request.UserAgent<br />
Url: @Request.Url.AbsoluteUri<br />
Url fragment: @Request.Url.Fragment<br />
returns:
Your user agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.16) Gecko/20110319 Firefox/3.6.16
Url: http://localhost:22713/tests
Url fragment:
Why is the fragment always empty? I need to be able to parse this information on the server side.
The fragment (everything after the # in a URL) doesn't get passed to the server, so the Fragment property will always be empty when you read it from a request.
The Fragment property is typically only used when constructing URLs.
There's no easy way to get the fragment on the server; typically you would have to use JavaScript on the client to retrieve it.
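As an illustration of that last point, the Fragment property is populated when you construct a Uri yourself on the server, even though it never arrives in an incoming request; a small sketch:
var uri = new Uri("http://localhost:22713/tests#123456");
Console.WriteLine(uri.AbsoluteUri);  // http://localhost:22713/tests#123456
Console.WriteLine(uri.Fragment);     // #123456 -- available because we built the Uri ourselves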
