Detect time required to download a file on the client machine - asp.net

Suppose my site has a provision for downloading files, so users can download a file from the site. I want to show information like how much time will be required to download the file on the client machine. Every client has a different internet speed, so how can I detect and show the time required to download a file on the client machine? Is this possible in ASP.NET? Please help me with sample code.

I can't give you a sample, but I can give you a general idea of how to do it. When the page is fully loaded, have it make an AJAX call to an ASP.NET page that returns around 1 MB of dummy data. The JavaScript saves a timestamp when it starts the AJAX call and another when the 1 MB call completes; the difference between the two timestamps is how long the connection takes for about 1 MB of data. With that knowledge you can calculate how long a download of XX MB will take.
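A rough sketch of the server side of that idea, assuming an .ashx handler named DummyData and a 1 MB payload (both are my inventions, not from the answer above). The handler streams random bytes so a proxy cache or gzip filter can't shrink the transfer and skew the measurement:

    // DummyData.ashx code-behind: returns ~1 MB of incompressible dummy data.
    using System;
    using System.Web;

    public class DummyDataHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "application/octet-stream";
            // Disable caching so the browser really downloads the payload every time.
            context.Response.Cache.SetCacheability(HttpCacheability.NoCache);

            var random = new Random();
            var buffer = new byte[1024];
            for (int i = 0; i < 1024; i++)       // 1024 x 1 KB = 1 MB in total
            {
                random.NextBytes(buffer);        // random bytes defeat compression
                context.Response.OutputStream.Write(buffer, 0, buffer.Length);
            }
        }

        public bool IsReusable { get { return true; } }
    }

The client-side timing then divides the payload size by the elapsed seconds and scales the result to the real file's size.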

A method I have seen on Microsoft's website for downloads is to provide a list of commonly used connection speeds in a ComboBox.
It lets the user know how much time the download will take with each type of connection.
Example:
Microsoft .NET Framework 4 (Web Installer)
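For illustration, a minimal sketch of how such an estimate table can be computed; the 48 MB file size and the connection speeds are made-up example values, not taken from Microsoft's page:

    using System;
    using System.Collections.Generic;

    class DownloadEstimate
    {
        static void Main()
        {
            const long fileSizeBytes = 48L * 1024 * 1024; // e.g. a 48 MB installer

            // Connection type -> speed in bits per second (illustrative values).
            var speeds = new Dictionary<string, double>
            {
                { "Dial-up (56 kbit/s)", 56000 },
                { "DSL (2 Mbit/s)",      2000000 },
                { "Cable (10 Mbit/s)",   10000000 },
            };

            foreach (var s in speeds)
            {
                double seconds = fileSizeBytes * 8.0 / s.Value;
                Console.WriteLine("{0}: about {1:F1} minutes", s.Key, seconds / 60);
            }
        }
    }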

Related

Right way to transfer a CSV file to a BI application?

We are building a BI application, and our customers send us data files daily. We exchange data using CSV files, because our customers are used to viewing data in Excel and are not yet ready to use an API on their system (maybe in a few years we will be able to use an XML/JSON web service, we hope).
Currently the transfer is done over FTP (SFTP, in fact). Our customers upload files automatically to an FTP server, and we have a CRON task that watches whether a file has been sent.
But there are many disadvantages to that:
We cannot reliably tell whether an upload is finished or still in progress (we asked them to upload each file under a temporary name and rename it afterwards, but many of them still don't do that)
So we can try to guess and consider the upload done once enough time has passed. But the FTP protocol doesn't let us query the server time, and clocks can be out of sync. We could upload an empty file and read its date to learn the server time, but that requires write permission...
The FTP protocol allows uploads to be paused...
So we are considering asking our customers to upload files directly to our application over HTTPS. This is more reliable, but less convenient:
Our customers cannot check the content of a file after upload
We have to be careful with upload size and timeouts on our server
Files can be quite large (up to 300 MB), so it's better to zip them before upload (which can reduce the size to 10%)
This is more work for us than just an FTP server (we need to build a UI, upload progress, a list of files so they can be downloaded back, ...)
Are there other solutions? How do BI applications usually exchange data? Is HTTPS a good solution for us?
We found a solution: a WebDAV server. We are using Nextcloud; it provides a web interface plus scripted access via the WebDAV protocol.
It's more reliable than FTP, because the file appears only once the upload is done.
And it's better than HTTP upload into our own application: we don't have to handle the file upload ourselves, build interfaces, ...

Building ASP.Net web page to test user's connection (bandwidth)

We need to add a feature to our website that allows the user to test his connection speed (upload/download). After some research I found that the usual way to do this is to download/upload a file and divide the file size by the time the task required, as here: http://www.codeproject.com/KB/IP/speedtest.aspx
However, this is not possible in my case: it has to be a web application, so I can't download/upload files to/from the user's machine for obvious security reasons. From what I have read, the download/upload should be done against the server, but how?
This website has exactly what I have in mind, but in PHP: http://www.brandonchecketts.com/open-source-speedtest. Unfortunately, I have no PHP background and this task is really urgent.
I really don't want to use something like Speedtest.
thanks
The formula for the download speed in bit/s:
DataSize in bytes * 8 / (time in seconds to download - latency to the server)
Download:
Host a blob of data (an HTML document, an image, whatever) whose size you know on your server, make an AJAX GET request to fetch it, and measure the time between the start of the download and its completion.
This can be done with JavaScript alone, but I would recommend including a timestamp of when the payload was generated in order to calculate the latency.
Upload:
Make an AJAX POST request to the server, filled with junk data of a specific size, and measure the time it takes to complete the POST.
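To make the formula concrete, here is a rough sketch of the download measurement using HttpClient in C# rather than browser-side JavaScript; the payload URL is a placeholder, and latency is approximated with a HEAD request (my assumption, since the answer suggests a server-generated timestamp instead). The upload direction is analogous: POST a byte array of known size and time it the same way.

    using System;
    using System.Diagnostics;
    using System.Net.Http;
    using System.Threading.Tasks;

    class SpeedTest
    {
        static async Task Main()
        {
            var url = "https://example.com/speedtest/1mb.bin"; // placeholder payload
            using (var client = new HttpClient())
            {
                // Approximate the latency with a request that transfers no body.
                var sw = Stopwatch.StartNew();
                await client.SendAsync(new HttpRequestMessage(HttpMethod.Head, url));
                double latency = sw.Elapsed.TotalSeconds;

                // Time the full download.
                sw.Restart();
                byte[] data = await client.GetByteArrayAsync(url);
                double total = sw.Elapsed.TotalSeconds;

                // DataSize in bytes * 8 / (download time - latency) = bit/s
                double bitsPerSecond = data.Length * 8.0 / (total - latency);
                Console.WriteLine("~{0:F1} Mbit/s", bitsPerSecond / 1000000.0);
            }
        }
    }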
List of things to keep in mind:
You will not be able to measure a higher bandwidth than your server's
Concurrent users will drain the bandwidth; build in a limit if this will be a problem
It is hard to build a bandwidth-measuring service with precision: the one you linked reports 5 Mbit/s to me, while my actual speed is around 30 Mbit/s according to Bredbandskollen
Other links on the subject
What's a good way to profile the connection speed of Web users?
Simple bandwidth / latency test to estimate a users experience
Note: I have not built any service like this, but it should do the trick.

What is a good way to generate thumbs?

On websites that allow you to upload images, how are the thumbs generated?
I know ImageMagick is typically used, but the question is how it works on the DB end. I suspect there is a queue in a DB and it can process N images at a time.
I'm talking about ASP.NET-based websites for the moment.
How does that queue work? One thought is to put code in Application_Start that launches a thread (or threads) as a daemon, which reads the DB for images that need to be processed and sleeps when no work is required.
Is this a good or 'proper' way? Which way is good practice?
I would not start extra threads like that within an ASP.NET process, because of application-pool recycling.
In many cases, you can probably do it in an async page right when the image is uploaded. If not, then perhaps a separate app (a service, perhaps) that handles the queue of thumbnails waiting to be generated.
On our ASP.NET site we used the standard .NET image-resizing APIs and performed the resize when the user posts the picture. The idea is that the time of the post is much longer than the processing of the image, so storing two images in the DB (thumb and original) is much faster than transferring the image over HTTP again.
If you really need more time for processing images and video, you might want to consider writing a Windows service to do the processing.
You store the raw data in a temp folder and add an entry to a table in your DB, which your service polls so it knows when to process data; the service then reports back to your DB so your web app knows when the data has been processed. More simply, you could have the service monitor a folder and process whatever is put in there, moving the processed files to a 'processed files' folder.
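A bare-bones sketch of that simpler folder-monitoring variant, assuming made-up folder names and a fixed 128-pixel thumbnail width (in a real Windows service this loop would run on the service's worker thread):

    using System;
    using System.Drawing;
    using System.IO;
    using System.Threading;

    class ThumbnailWorker
    {
        const string Incoming  = @"C:\jobs\incoming";
        const string Processed = @"C:\jobs\processed";
        const string Thumbs    = @"C:\jobs\thumbs";

        static void Main()
        {
            while (true)
            {
                foreach (var path in Directory.GetFiles(Incoming, "*.jpg"))
                {
                    using (var image = Image.FromFile(path))
                    {
                        int width = 128;
                        int height = image.Height * width / image.Width; // keep aspect ratio
                        using (var thumb = new Bitmap(image, width, height))
                        {
                            thumb.Save(Path.Combine(Thumbs, Path.GetFileName(path)));
                        }
                    }
                    File.Move(path, Path.Combine(Processed, Path.GetFileName(path)));
                }
                Thread.Sleep(5000); // sleep when no work is pending
            }
        }
    }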

Need to check uptime on a large file being hosted

I have a dynamically generated RSS feed that is about 150 MB in size (don't ask).
The problem is that it keeps crapping out sporadically, and there is no way to monitor it without downloading the entire feed to get a 200 status. Pingdom times out on it and returns a 'down' error.
So my question is: how do I check that this thing is up and running?
What type of web server and server-side coding platform are you using (if any)? Is any of the content coming from a backend system/database to the web tier?
Are you sure the problem is not with the client code accessing the file? Most clients have timeouts, and downloading large files over the internet can be a problem depending on how the server behaves. That is why file-download utilities track progress and download in chunks.
It is also possible that other load on the web server, or the number of users, is affecting the server. If little memory is available, some servers may not be able to serve a file of that size to many users. You should review how the server sends the file and make sure it chunks it up.
I would recommend doing a HEAD request to check, at minimum, that the URL is accessible and the server is responding. The next step might be to set up your download test inside, or very close to, the data center hosting the file to monitor further. This may reduce cost and will reduce interference.
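A minimal sketch of such a HEAD check in C# (the feed URL is a placeholder); because only the headers come back, the 150 MB body is never transferred:

    using System;
    using System.Net;

    class UptimeCheck
    {
        static void Main()
        {
            var request = (HttpWebRequest)WebRequest.Create("https://example.com/feed.rss");
            request.Method = "HEAD";      // ask for headers only, no body
            request.Timeout = 10000;      // fail fast if the server hangs

            try
            {
                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    Console.WriteLine("Status: " + (int)response.StatusCode); // expect 200
                }
            }
            catch (WebException ex)
            {
                Console.WriteLine("Feed appears to be down: " + ex.Message);
            }
        }
    }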
Found an online tool that does what I needed:
http://wasitup.com uses HEAD requests, so it doesn't time out waiting to download the whole 150 MB file.
Thanks for the help BrianLy!
Looks like Pingdom does not support HEAD requests. I've put in a feature request, but who knows.
I hacked this capability into mon for now (mon is a nice compromise between paying someone else to monitor and doing everything yourself). I have switched entirely to HTTPS, so I modified the HTTPS monitor to do it. I did it the dead-simple way: copied the https.monitor file and called it https.head.monitor. In the new monitor file I changed the line that says (you might also want to update the function name and the place where it's called):
get_https to head_https
Now in mon.cf you can call a head request:
monitor https.head.monitor -u /path/to/file

Application Design - Daemon w/ WebPage FrontEnd

I have an application that scans an input directory every five seconds; when a job (i.e. a file) is placed in the directory, the application reads it, processes it, and outputs another file (or files) to an output directory.
My question is: if I wanted to put a web-based front end on this application, how would I wait for the processing to be complete?
User submits job
job is placed in input directory
......What am I doing on the web page here?
processing occurs
output file is generated
......How do I know that the job is finished?
The two solutions I came up with were:
poll output directory every x seconds from the webpage
use ajax to poll a webservice or webpage that reports back whether the output file is present
Is there a better design for the server? In other words, how would TCP or named pipes help in this situation? (I cannot use Remoting because of a DCOM object.)
A solution we have used commercially in the past: the daemon writes what it is doing to a log (typically a DB) with a date/time stamp, and the web front end simply shows the latest X entries from that log, with a little toggle to hide all of the "Looked in directory, no file found" messages. It worked fairly well, and we later upgraded it with AJAX (a timer that reloaded every 20 seconds).
I don't think named pipes are going to make it any easier to get the web client to poll automatically, but they might make the server better able to notify another process that the conversion has completed, and ultimately queue a message for the web browser.
You can have the web client poll every few seconds to see if the file processing has completed; alternatively, you could have something like Juggernaut "push" commands out to the page. Juggernaut uses Flash to open a socket in the web browser that continually feeds JavaScript from the server. It could send a command alerting the browser that the file has completed and then issue a redirect.
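A small sketch of the polling endpoint from the question's second option: a handler the page can hit with AJAX every few seconds, answering whether the output file exists yet. The output folder and the job-id file-naming convention are my assumptions:

    using System.IO;
    using System.Web;

    public class JobStatusHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // Path.GetFileName strips any directory parts, so a caller
            // cannot probe outside the output folder.
            string jobId = Path.GetFileName(context.Request.QueryString["id"] ?? "");
            string outputFile = Path.Combine(@"C:\jobs\output", jobId + ".out");

            context.Response.ContentType = "application/json";
            context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
            context.Response.Write(
                File.Exists(outputFile) ? "{\"done\":true}" : "{\"done\":false}");
        }

        public bool IsReusable { get { return true; } }
    }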
