Any tool for WebDAV load testing? - webdav

I want to generate WebDAV traffic (to reproduce a bug in an open source server).
I know WebDAV is a layer over HTTP, so I could generate such traffic by reading the protocol specification and sending tens of hand-crafted HTTP requests, but I don't want to reinvent the wheel; presumably someone has already written such a tool?
Ideally the tool would:
Log in at a URL
Randomly navigate directories
Download/upload files from time to time.
Bonus if it is free/open source.
This discussion suggests JMeter does not have this feature.
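For reference, a single hand-crafted WebDAV request is not hard to send by hand. The sketch below (Python with the requests library; the URL and credentials are placeholders, not from any specific server) issues a PROPFIND to list a collection:

    import requests

    # Placeholder server and credentials; replace with your own.
    BASE_URL = "http://localhost:8080/webdav/"
    AUTH = ("alice", "secret")

    # PROPFIND with Depth: 1 lists the members of a collection.
    PROPFIND_BODY = """<?xml version="1.0" encoding="utf-8"?>
    <D:propfind xmlns:D="DAV:">
      <D:allprop/>
    </D:propfind>"""

    response = requests.request(
        "PROPFIND",
        BASE_URL,
        auth=AUTH,
        headers={"Depth": "1", "Content-Type": "application/xml"},
        data=PROPFIND_BODY,
    )
    print(response.status_code)   # 207 Multi-Status on success
    print(response.text)          # XML listing of the collection

Doing this for a realistic, multi-user load is the part that is tedious to hand-roll, hence the question.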

I found a tool called "Prestan" on SourceForge. It's based on this paper by Teng Xu.

There was no such tool, so I wrote one.
Mecadaver logs in at a WebDAV URL, and starts navigating directories at random, sometimes downloading documents, sometimes going back to the root.
Runs as many concurrent users as you want (you provide a list of usernames/passwords in a CSV file)
Various parameters to make the load more or less intensive
Open source
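To illustrate the kind of random walk such a tool performs (this is not Mecadaver's actual code, just a rough sketch in Python with requests; the URL, credentials, step count, and the trailing-slash heuristic for collections are all assumptions):

    import random
    import xml.etree.ElementTree as ET
    from urllib.parse import urljoin

    import requests

    BASE_URL = "http://localhost:8080/webdav/"   # placeholder
    AUTH = ("alice", "secret")                    # placeholder
    STEPS = 50

    def list_hrefs(url):
        """Return the hrefs of a collection's members via PROPFIND Depth: 1."""
        body = '<?xml version="1.0"?><D:propfind xmlns:D="DAV:"><D:allprop/></D:propfind>'
        resp = requests.request("PROPFIND", url, auth=AUTH,
                                headers={"Depth": "1"}, data=body)
        tree = ET.fromstring(resp.content)
        hrefs = [e.text for e in tree.iter("{DAV:}href")]
        return [h for h in hrefs if h and not url.endswith(h)]  # drop the collection itself

    current = BASE_URL
    for _ in range(STEPS):
        members = list_hrefs(current)
        if not members or random.random() < 0.2:
            current = BASE_URL                    # sometimes go back to the root
            continue
        pick = random.choice(members)
        target = urljoin(current, pick)
        if pick.endswith("/"):                    # heuristic: collections end in "/"
            current = target                      # navigate into the directory
        else:
            requests.get(target, auth=AUTH)       # download the document

A real load tool would run many of these loops in parallel, one per user, which is essentially what the CSV of usernames/passwords is for.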
Note: Silk Performer can do WebDAV test loads as well, but it is very expensive.

Related

How to serve a PDF file via an HTTP request

I'm working with some new SCADA software, which uses a browser environment to display everything. One component the software provides is a PDF viewer; however, since we're in a browser environment, it can only load files that are served over HTTP. According to the forums, this means that the source of the PDF needs to be a URL.
The forum also notes that I can use one of their modules (WebDev) to "stream the PDF bytes over HTTP", and provides directions for how to do so. However, the WebDev module is outside the budget of my project (it's quite a high-powered module, I'd be paying a premium price and then using 1% of its functionality). So I'm wondering if it's possible to serve up a PDF via HTTP some other way.
I'm not an experienced programmer - I'm self-taught out of necessity on a small handful of languages, and only to a basic level. As such, I don't fully understand the problem, nor do I know what search terms to use to find the sort of information I need to solve it.
If anyone's able to provide a partial solution, or even just able to help me understand what I'm asking for and where to go looking for some answers, I'd appreciate it!
The PC hosting the PDF files and the SCADA gateway is running Windows 10.
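For reference, serving a folder of PDFs over plain HTTP needs nothing more than a static file server, and the viewer can then be pointed at the resulting URL. A minimal sketch, assuming Python is installed on the Windows 10 machine and using a placeholder folder path:

    # serve_pdfs.py - minimal static file server for a folder of PDFs.
    # Run with:  python serve_pdfs.py
    # Then load e.g. http://<host>:8000/report.pdf in the viewer.
    from functools import partial
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    PDF_DIR = r"C:\pdf_reports"   # placeholder path to the folder with the PDFs
    PORT = 8000

    handler = partial(SimpleHTTPRequestHandler, directory=PDF_DIR)
    HTTPServer(("0.0.0.0", PORT), handler).serve_forever()

For anything production-grade you would put this behind a proper web server (IIS, NGINX), but it demonstrates the idea of "serving the PDF over HTTP" without buying an extra module.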
I had the same issue integrating PDF reports into our SCADA system, which has a web interface and runs Node.js on the backend.
The main points are:
Generate your PDF on the client end (web interface)
Convert it to a Base64 data URI
Preview it in the DOM, or send it to the server
The same approach works for sending Excel and PDF files to the server side.
Hope that helps!
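The data-URI step looks like this in Python (a sketch; the file name is a placeholder, and in the browser you would use FileReader.readAsDataURL instead of reading from disk):

    import base64

    with open("report.pdf", "rb") as f:                     # placeholder file name
        encoded = base64.b64encode(f.read()).decode("ascii")

    # A data: URI can be assigned directly to the src of an <embed> or <iframe>.
    data_uri = "data:application/pdf;base64," + encoded
    print(data_uri[:60] + "...")

Note that Base64 inflates the payload by roughly a third, so this is best suited to reasonably small PDFs.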

Batch HTTP download standard

I have multiple files available on an HTTP server which users can download, and I want to provide a single download link which will allow the user to download multiple files with one click (using just a web browser or a download manager). The files are very big, so packing them into archives on the fly and providing a zip download is not an option.
Is there any combination of internet standard plus download manager which will allow for such batch HTTP downloads?
I know there is the Metalink standard (wiki), but it doesn't seem to be very popular and there are almost no client applications supporting it:
DownThemAll - may soon be discontinued because Mozilla is going to deprecate its XUL extension API
aria2c - although it works great, it is a command-line-only client, which is not very user friendly
Torrents, I would guess, can't be used for a simple HTTP download; they require the BitTorrent protocol, right?
Do you know of any other alternative, short of inventing my own standard and/or writing my own download manager?
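For what it's worth, a Metalink 4 (RFC 5854) description is just a small XML file listing the member files and their URLs. The sketch below (Python, with placeholder file names and URLs) generates one; a Metalink-aware client such as aria2c or a download manager with Metalink support can then fetch every file from that single batch.meta4 link:

    # make_metalink.py - generate a minimal Metalink 4 (RFC 5854) file.
    import xml.etree.ElementTree as ET

    # Placeholder download URLs; replace with the real file list.
    FILES = {
        "dataset-part1.bin": "http://example.com/files/dataset-part1.bin",
        "dataset-part2.bin": "http://example.com/files/dataset-part2.bin",
    }

    NS = "urn:ietf:params:xml:ns:metalink"
    ET.register_namespace("", NS)

    root = ET.Element(f"{{{NS}}}metalink")
    for name, url in FILES.items():
        file_el = ET.SubElement(root, f"{{{NS}}}file", {"name": name})
        ET.SubElement(file_el, f"{{{NS}}}url").text = url

    ET.ElementTree(root).write("batch.meta4", xml_declaration=True, encoding="utf-8")

Real Metalink files usually also carry hashes and sizes so clients can verify and resume downloads, but the minimal form above is enough to express "these N files belong to one batch".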

Accessing a console application from web page

I've recently created two C# console applications. The first transforms a bunch of command outputs into an XML, and the second transforms the XML into a Word document using a template.
I'd like to know how I could get this onto the web, i.e. having a web page where the command output can be uploaded, the two-step conversion executed, and finally the Word document made available for download.
Should the web page be created in ASP.NET or are there other (better) options? Do I need to rewrite the console applications in some other format?
This question is fairly broad, with plenty of room for novel-sized explanations, but here's a brief high-level walkthrough of what likely needs to happen to achieve the proposed result (language-agnostic):
Get a hosting provider that allows users to spin up their own machine (e.g. AWS).
Spin up a machine that is compatible with the "console" programs in question.
Install the "console" programs on the machine.
Install a programming language/runtime (e.g. Node.js, PHP, ASP.NET/C#) on the machine.
Install a web server (e.g. NGINX, Apache) on the machine, and configure it to serve public requests and run with the chosen language.
On each server request, execute the appropriate commands from within the chosen language. Languages typically come with an exec method (e.g. in Node.js: require('child_process').exec(command, options, callback)); see the sketch after this list.
Get the results of those commands and send them back to the client. Alternatively (for downloads), write the result to a path on the system that is publicly available to the internet and redirect the user to that URL (additional configuration might be required to make sure the browser downloads the file as opposed to just displaying it).
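Here is what steps 6-7 can look like in practice. This is only a sketch using Python and Flask rather than the Node.js example above; the executable names, paths, and the upload field name are placeholders, not your actual console apps:

    # app.py - upload command output, run the two converters, return the .docx.
    import subprocess
    import tempfile
    from pathlib import Path

    from flask import Flask, request, send_file

    app = Flask(__name__)

    @app.route("/convert", methods=["POST"])
    def convert():
        workdir = Path(tempfile.mkdtemp())
        raw = workdir / "output.txt"
        xml = workdir / "output.xml"
        doc = workdir / "report.docx"

        # 1. Save the uploaded command output (form field name is a placeholder).
        request.files["output"].save(str(raw))

        # 2. Run the two console applications (placeholder paths and arguments).
        subprocess.run([r"C:\tools\ToXml.exe", str(raw), str(xml)], check=True)
        subprocess.run([r"C:\tools\ToWord.exe", str(xml), str(doc)], check=True)

        # 3. Send the generated Word document back as a download.
        return send_file(doc, as_attachment=True)

    if __name__ == "__main__":
        app.run(port=5000)

The same shape works in any of the languages listed above; the point is simply "receive upload, shell out to the converters, stream the result back".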
The steps above should get you pretty close to what you want. As for your questions:
Should the web page be created in ASP.NET or are there other (better)
options?
The "better" options is whatever you feel most comfortable with at the moment, you could always change it later with reasonable effort (assuming that your "console" apps are not unsuspecting unicorns).
Do I need to rewrite the console applications in some other format?
No, unless you have strong reasons to do so (e.g. multi-environment compatibility). You could also rewrite them to simplify things significantly (e.g. bypass the CLI entirely and do everything in C#).
Try thinking through these high-level steps, begin working on an implementation, and post more specific questions here on Stack Overflow when you get stuck.
I hope that helps!

Is there a solution for a BitTorrent Uploader?

I have a requirement from my client to be able to upload extremely large files.
I'm talking about 7 GB files. The website they are currently running is an ASP.NET 4.0 app, so obviously the standard upload scheme for my web app is not going to work.
I'm tossing around multiple options trying to figure out what the best route to go would be.
One option I'm considering is a BitTorrent uploader. The end users for this app will typically have the same file on hand, so the idea is that an end user would go to the site and say that they want to upload a file. At that point, they would pick the file, and the server would immediately mark that person as a seed for it. Then my web app would go to a preconfigured leech on our side and instruct it to download the file. I would expect that at some point during or after this process the torrent would do some magic to find other seeders on the client's network, or wherever, but that's the idea.
Is there any technology out there already that does this? Or am I describing something that I'm going to have to build from the ground up?
It doesn't sound like it's going to be easy to do this with BitTorrent. In order for BT to work, you need torrent files. In order to create a torrent file for a particular file, you need that file (the torrent file basically contains a hash of the file). In general for a torrent, you need a tracker. You could rely on a public one, but that could be a risky dependency. You could operate your own, but that has other challenges (for one, you'd have to make sure it's locked down so it doesn't become a free-for-all for all the latest movies, music & TV).
Assuming you have a tracker in place, you then need to coordinate the downloading of torrents. Your users are going to have to create the torrent files, which is an extra complicated step, then presumably upload them via usual HTTP methods. As well as getting the user to upload the torrent, you'd have to remind the user to start seeding the torrent in their client of choice. You'd then want to automatically begin leeching the torrent (again, security issue here - what if a user uploads a completely unrelated torrent for the latest episode of House?). Apart from the security problem, this is probably the easiest part - most torrent clients can be configured to watch a directory and automatically start downloading torrent files in that directory. Once you've started downloading, you have to make sure that the user continues seeding the torrent until you've completed, otherwise you'll be stuck with a useless partial file.
It could all work, but without a fair bit of customisation work it's going to be a convoluted process at best for your users, and quite possibly beyond them. Obviously I don't know your specific requirements, but I'd be looking at more traditional file transfer protocols, like FTP.
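To make the "the torrent file basically contains a hash of the file" point concrete, here is a minimal single-file .torrent generator (a sketch in pure Python; the tracker URL and file name are placeholders, and real clients typically add more fields):

    # make_torrent.py - build a minimal single-file .torrent.
    import hashlib

    PIECE_LENGTH = 256 * 1024                        # 256 KiB pieces
    TRACKER = "http://tracker.example.com/announce"  # placeholder tracker URL
    FILE_NAME = "upload.bin"                         # placeholder file to share

    def bencode(value):
        """Encode ints, bytes, str, lists and dicts in bencoding."""
        if isinstance(value, int):
            return b"i%de" % value
        if isinstance(value, str):
            value = value.encode("utf-8")
        if isinstance(value, bytes):
            return b"%d:%s" % (len(value), value)
        if isinstance(value, list):
            return b"l" + b"".join(bencode(v) for v in value) + b"e"
        if isinstance(value, dict):
            out = b"d"
            for k in sorted(value):                  # keys must be sorted
                out += bencode(k) + bencode(value[k])
            return out + b"e"
        raise TypeError(type(value))

    # SHA-1 of every piece, concatenated - this is the "hash of the file".
    pieces = b""
    length = 0
    with open(FILE_NAME, "rb") as f:
        while chunk := f.read(PIECE_LENGTH):
            pieces += hashlib.sha1(chunk).digest()
            length += len(chunk)

    meta = {
        "announce": TRACKER,
        "info": {
            "name": FILE_NAME,
            "length": length,
            "piece length": PIECE_LENGTH,
            "pieces": pieces,
        },
    }

    with open(FILE_NAME + ".torrent", "wb") as f:
        f.write(bencode(meta))

Note that this is exactly the step you would be pushing onto your users (or hiding behind custom tooling), which is why the workflow above ends up so convoluted.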

Is there a way to speed up file uploads?

I understand that file upload speeds are limited by the upload speed of the internet connection, among other things. Is it possible to use jQuery or some other method to compress the file locally before upload, and then upload that compressed file to the server? Any other solutions?
While others have already provided answers, one thing you might be able to do (depending on how your website is set up) is, once the user has chosen the file, begin the upload process immediately. That way, if the user has to fill in additional information about the file (maybe a description, a different name for the server, keywords, etc.), the file is uploading in the meantime and the information can be provided later.
Other than that, you're SOL.
If upload speed is a concern, perhaps consider a client side application the user has to download.
Or a Flash-based uploader. Using Flash you'd get more control over the upload, and it is consistent across browsers. This is what YouTube does to allow 2 GB video uploads with minimal stress on the user's part. It doesn't make the upload faster if the client's connection is poor, but it helps with the reliability of the upload.
The browser already takes care of all the little optimizations that would make it faster on the client side, so no, you can't really use JavaScript to speed up a file upload. There isn't much you can do if the client's connection is the bottleneck.
No, you can't read the local filesystem in JavaScript. You can't do it with Flash or Java under the default config, either (with the partial exception of Flash 10). Further, there is no standard way to send compressed requests (the way there is for responses).
The upload time is determined by a variety of factors: network speed, web server response time, upload file size, and so on. Check with your IT department and go over those points. If the issue is file size, there are ways to compress and reduce file size on Android. Refer to this sample code for Android compression and document capture:
https://github.com/ExtrieveTechnologies/QuickCapture
