We have a daily need to ship about 500 MB of compressed image files (about 280K each).
Currently we do this the fast easy way. A web server, and downloads via http.
We are now looking at putting a better client (nw.js) on the client side, which gives us the opportunity to improve the transport protocol.
Data flows only one way.
We have a couple of thoughts but I would love to hear better ideas.
Using an HTTP/2 (SPDY) compliant server, and the Chromium hooks in nw.js for HTTP/2 receiving
Using a TCP connection (custom node.js server -> node code in nw.js; see the sketch after this question)
Perhaps we should look at QUIC: https://www.chromium.org/quic
Would bundling this into a zip file (which would not decrease the byte count, since the images are already compressed) help?
What do OneDrive, Google Drive, and Dropbox do in these cases?
Any thoughts?
Has anyone tried Aspera: http://asperasoft.com/software/transfer-clients/
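For the TCP option above, here is a minimal sketch of what the sending side could look like; the port, file name, and 4-byte length prefix are illustrative assumptions, not a worked-out protocol:

    // Stream one compressed image over a raw TCP socket, prefixed with
    // its length so the receiver knows where the file ends.
    const net = require('net');
    const fs = require('fs');

    const server = net.createServer((socket) => {
      const path = 'images/batch-0001.bin';    // hypothetical file
      const header = Buffer.alloc(4);
      header.writeUInt32BE(fs.statSync(path).size, 0);
      socket.write(header);                    // 4-byte big-endian length
      fs.createReadStream(path).pipe(socket);  // then the raw bytes
    });
    server.listen(9090);

The nw.js side would read the 4-byte header, then consume exactly that many bytes before expecting the next file.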
Windows 10 systems can take advantage of "TCP Fast Open" technology, which you can read about here:
https://en.wikipedia.org/wiki/TCP_Fast_Open
To enable this technology, Chromium accepts this parameter:
--enable-tcp-fastopen
From what I've read this only works on Windows 10, but I've no idea about other platforms. Good luck.
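If you want to try it in nw.js, the flag can reportedly be passed through the chromium-args field of the app's package.json (the name and main values below are placeholders):

    {
      "name": "image-sync-client",
      "main": "index.html",
      "chromium-args": "--enable-tcp-fastopen"
    }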
I am searching for a good method to transfer data over the internet, and I work in a C++/Windows environment. The data is binary: a compressed blob of an extracted image. Input and requirements are as follows:
6 kB/packet @ 10 packets/sec (60 kB per second)
Reliable data transfer
I am new to network programming, and so far I have figured out that one of the following methods may be suitable.
Sockets
MSMQ (MS Message Queuing)
The client runs in a browser (it shows realtime images in the browser), while the server runs native C++ code. Please let me know if there are any other methods for achieving this. Which one should I go for, and why?
If the server determines the pace at which images are sent, which is what it looks like, a server-push style solution would make sense. What most browsers (and even non-browsers) are settling on these days is WebSockets.
The main advantage WebSockets have over most proprietary protocols, apart from becoming a widely adopted standard, is that they run on top of HTTP and can thus permeate (most) proxies and firewalls etc.
On the server side, you could potentially integrate node.js, which lets you implement WebSockets easily and comes with a lot of other libraries. It's written in C++ and is extensible via C++ and via JavaScript, for which it hosts a VM. node.js's main feature is being asynchronous at every level, making that style of programming the default.
But of course there are other ways to implement WebSockets on the server side; maybe node.js is more than you need. I have implemented a C++ extension for node.js on Windows and use socket.io to do WebSockets (and non-WebSocket transports for older browsers), and that has worked out fine for me.
But that was textual data. In your binary case, socket.io wouldn't do it, so you could check out other libraries that do binary over WebSockets.
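As a minimal sketch of the server-push idea with one such library, assuming the ws package (any WebSocket library with binary frame support would do; the file name and port are made up):

    const WebSocket = require('ws');
    const fs = require('fs');

    const wss = new WebSocket.Server({ port: 8080 });

    wss.on('connection', (socket) => {
      // Push one frame every 100 ms (10 packets/sec, ~6 kB each).
      const timer = setInterval(() => {
        const frame = fs.readFileSync('latest-frame.bin'); // hypothetical source
        socket.send(frame, { binary: true });              // binary frame
      }, 100);
      socket.on('close', () => clearInterval(timer));
    });

On the browser side you'd set socket.binaryType = 'arraybuffer' and render each message as it arrives.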
Is there any specific reason why you cannot run a server on your Windows machine? At 60 kB/second, this looks like some kind of embedded device?
Based on your description, you need to show image information in realtime in a browser. You could possibly use HTTP, but it's stateless: once the information is transferred, you lose the connection, so your client would need to poll the C++/Windows machine. If you are pretty confident the information is generated periodically, you can use this approach. It requires a server, so it only applies if the answer to my first question is yes.
A chat protocol. Something like a Jabber client running on your client, and a Jabber server on your C++/Windows machine. Chat protocols allow almost-realtime delivery.
While it may seem to make sense, I wouldn't use MSMQ in this scenario. You may not run into a problem now, but MSMQ messages are limited in size and you may eventually hit a wall because of this.
I would use TCP for this application: TCP is built with reliability in mind, and you can simply feed data through a socket. You may have to figure out a very simple protocol yourself (see the sketch below), but it should be the best choice.
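To illustrate, a length-prefix framing is about as simple as such a protocol gets: write a 4-byte size before each image, and on the receiving end split the byte stream back into messages. Sketched in node.js for brevity (the same framing is straightforward in C++); the port and handler are assumptions:

    const net = require('net');

    const handleImage = (buf) => console.log('got frame,', buf.length, 'bytes');

    const server = net.createServer((socket) => {
      let pending = Buffer.alloc(0);
      socket.on('data', (chunk) => {
        pending = Buffer.concat([pending, chunk]);
        // Extract every complete message currently buffered.
        while (pending.length >= 4) {
          const size = pending.readUInt32BE(0);
          if (pending.length < 4 + size) break; // message incomplete
          handleImage(pending.slice(4, 4 + size));
          pending = pending.slice(4 + size);
        }
      });
    });
    server.listen(7070);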
Unless you are using an embedded device that understands MSMQ out of the box, your best bet for using MSMQ would be a proxy, and then you are still forced to play with TCP and possibly HTTP.
I do home automation (including security cameras) in my personal time using the .NET Micro Framework, and even if it did have MSMQ capabilities I still wouldn't use it.
I recommend that you look into MJPEG (Motion JPEG), which sounds exactly like what you would like to do.
http://www.codeproject.com/Articles/371955/Motion-JPEG-Streaming-Server
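For a sense of how little MJPEG needs, here is a rough node.js version of the same idea: each JPEG is written as one part of a multipart/x-mixed-replace response, which browsers render as a continuously updating image. The frame file, port, and rate are placeholders:

    const http = require('http');
    const fs = require('fs');

    http.createServer((req, res) => {
      res.writeHead(200, {
        'Content-Type': 'multipart/x-mixed-replace; boundary=frame'
      });
      const timer = setInterval(() => {
        const jpeg = fs.readFileSync('current.jpg'); // hypothetical frame source
        res.write('--frame\r\nContent-Type: image/jpeg\r\n\r\n');
        res.write(jpeg);
        res.write('\r\n');
      }, 100);                                       // ~10 frames/sec
      req.on('close', () => clearInterval(timer));
    }).listen(8081);

A plain <img src="http://host:8081/"> is then enough to display the stream.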
On my office website, the page has 3 CSS files, 2 JavaScript files, 11 images, and the page itself: 17 requests to the server in total. If 10,000 people visit my office site ...
Could this slow the website down because of the number of requests?
And could the heavy traffic cause any issues for the server?
My tiny office server has:
Intel i3 processor
Nvidia 2 GB graphics card
Windows Server 2008
8 GB DDR3 RAM
500 GB hard disk
The website is developed in ASP.NET.
The internet connection is 10 Mbps download and 2 Mbps upload, using a static IP address.
There are many reasons a website may be slow.
A huge spike in additional traffic.
Extremely large or non-optimized graphics.
A large number of external calls.
Server issues.
All websites should have optimized images, Flash files, and videos. Large media files slow down the overall loading of each page. Optimize each image; PNG images have improved weighted optimization that can offer better-looking images with a smaller file size. You could also run a traceroute to your site.
Hope this helps.
This question is impossible to answer because there are so many variables. It sounds like you're hypothesising that you will have 10,000 simultaneous users; do you really expect there to be that many?
The only way to find out if your server and site hold up under that kind of load is to profile it.
There is a tool called Apache Bench (http://httpd.apache.org/docs/2.0/programs/ab.html) which you can run from the command line to simulate a number of requests to your server and benchmark it. The tool comes with an install of Apache. You can simulate 10,000 requests to your server and see how the request time holds up; at the same time, you can run Performance Monitor in Windows to diagnose any bottlenecks.
Example usage, taken from Wikipedia:
ab -n 100 -c 10 http://www.yahoo.com/
This will execute 100 HTTP GET requests, processing up to 10 requests concurrently, to the specified URL, in this example "http://www.yahoo.com".
I don't think that downloads your page dependencies (JS, CSS, images), but there are probably other tools you can use to simulate that.
I'd also recommend that you enable compression on your site and set up caching, as this will significantly reduce the load and the number of requests for very little effort.
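For the ASP.NET/IIS setup described in the question, both usually come down to a few lines of web.config; the exact elements vary by IIS version, so treat this as a starting point rather than a drop-in:

    <!-- Enable gzip compression and a 7-day client cache for static files -->
    <system.webServer>
      <urlCompression doStaticCompression="true" doDynamicCompression="true" />
      <staticContent>
        <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
      </staticContent>
    </system.webServer>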
Rather than hardware, you should think about your server's upload capacity. If your upload bandwidth is low, of course it would be a problem.
The most likely reason is that one session locks all the other requests.
If you don't use session state, turn it off and check again.
Related:
Replacing ASP.Net's session entirely
jQuery Ajax calls to web service seem to be synchronous
So when I am debugging my web applications and such, I've used the Charles web proxy and debugger and love it. It's so nice to see what's being sent and received via ports 80 and 443. I can see all the resources loading, not just from the "browser" per se, but also Flash applications. I can also see how the calls are being made, and it's pretty easy to reconstruct them. It's a great debugging tool and I love it.
So I'm wondering two things:
First, I'm wondering if there is something similar I can use to watch traffic that might be coming through on other ports. I guess some desktop applications will use the internet, but not necessarily via HTTP/HTTPS requests. I remember looking at some security tools a few years ago; there are a lot of security tools out there, like Kismet, Ettercap, Wireshark, etc. Is there one that does what I'm describing in an easy and intuitive way?
Also, I'm wondering: if I am using my iPhone / iPad / Android device, how can I set up a proxy through my computer so I can watch the HTTP/HTTPS requests that the device makes?
Found the answer to that one here: http://www.ravelrumba.com/blog/ipad-http-debugging/
I'm mostly on a Mac so anything that is Mac friendly would be extra helpful.
Thanks!
I believe you are looking for Wireshark. It allows you to monitor the network interface on your machine and shows you sent/received packets as well as their protocols. It also has a protocol decoder that can be used to get Layer 7 information about an IP stream, and you can do a "Follow TCP Stream", which lets you view the entire conversation of that connection. It's based on libpcap (packet capture), which the built-in tcpdump also uses.
The only downside for you web developers is that if you're using SSL-encrypted sessions, you can't decode them. The endpoints of the SSL session are "above" (in the OSI model) the layer at which Wireshark and similar tools operate.
Here's a good list: http://sectools.org/sniffers.html. I used Wireshark back when it was Ethereal. At that time it ran under X11; it looks like that has changed.
Imagine you have a web site to which you want to send a lot of data, say 40 files totalling the equivalent of 2 hours of upload bandwidth. You expect to have 3 connection losses along the way (think mobile data connection, WLAN vs. microwave). You can't be bothered to retry again and again; this should be automated. Interruptions should not cause more data loss than necessary, and retrying a complete file is a waste of time and bandwidth.
So here is the question: Is there a software package or framework that
synchronizes a local directory (contents) to the server via HTTP,
is multi-platform (Win XP/Vista/7, MacOS X, Linux),
can be delivered as one self-contained executable,
recovers partially uploaded files after interrupted network connections or client restarts (see the sketch after the related work below),
can be generated on a server to include authentication tokens and upload target,
can be made super simple to use
or what would be a good way to build one?
Options I have found until now:
Neat packaging of rsync. This requires an rsync (server) instance on the server side that is aware of a privilege system.
A custom Flash program. As I understand it, Flash 10 is able to read a local file as a ByteArray (indicated here) and is obviously able to speak HTTP to the originating server. Seen in question 1344131 ("Upload image thumbnail to server, without uploading whole image").
A custom native application for each platform.
Thanks for any hints!
Related work:
HTML5 will allow multiple files to be uploaded, or at least selected for upload, "at once". See here for an example. This is agnostic to the local files and does not feature recovery of a failed upload.
Efficient way to implement a client multiple file upload service basically asks for SWFUpload or YUIUpload (Flash-based multi-file uploaders, otherwise "stupid")
A comment in question 997253 suggests JUpload - I think using a Java applet will at least require the user to grant additional rights so it can access local files
GearsUploader seems great but requires Google Gears - that is going away soon
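To make the recovery requirement concrete, the usual approach is: ask the server how many bytes of a given upload it already has, then send only the remainder. A rough node.js sketch of the client side; the /uploads/:id/offset endpoint and the Upload-Offset header are invented for illustration, not an existing API:

    const http = require('http');
    const fs = require('fs');

    function resumeUpload(path, host, id) {
      // 1. Ask the server for the byte offset it has stored for this upload.
      http.get({ host, path: '/uploads/' + id + '/offset' }, (res) => {
        let body = '';
        res.on('data', (d) => { body += d; });
        res.on('end', () => {
          const offset = parseInt(body, 10) || 0;
          const size = fs.statSync(path).size;
          // 2. PUT only the bytes the server does not have yet.
          const req = http.request({
            host,
            path: '/uploads/' + id,
            method: 'PUT',
            headers: { 'Upload-Offset': offset, 'Content-Length': size - offset }
          });
          fs.createReadStream(path, { start: offset }).pipe(req);
        });
      });
    }

After another interruption the client simply calls resumeUpload again; only bytes the server never received are resent.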
What application do you use to monitor HTTP communication on OS X?
Charles Proxy
Charles is an HTTP proxy / HTTP monitor / Reverse Proxy that enables a developer to view all of the HTTP traffic between their machine and the Internet. This includes requests, responses and the HTTP headers (which contain the cookies and caching information).
Runs on Java. Available on OS X, Linux and Windows.
I like TcpCatcher. It is free and 100% Java-based, so it works fine on Mac OS X.
Not only will you be able to monitor HTTP communication, but you will also be able to change requests/responses on the fly, which opens up very interesting possibilities.
There is a dedicated tutorial on capturing iPhone's HTTP communication.
If you're looking to trace application traffic, Wireshark is the best tool I've found. It can log and decode HTTP and many other protocols, and the GUI's search tools make finding the messages you're interested in pretty quick and painless.
Other reasons I recommend this:
It's quick to install
It captures traffic straight from the network card; there is no need to change the application or set up proxies, etc. It'll even read dumps captured from tcpdump and similar tools offline
It's multi-platform (works on Windows/Mac/Linux and others)
It's open source
HTTPTracer
http://simile.mit.edu/wiki/HTTPTracer
You could also use dTrace to monitor in even more detail, if that's what you need.
I second using Charles; it's a really excellent tool for HTTP examination. When used with the iPhone simulator (or any other OS X application), Charles automatically sets up the system settings to use itself as a proxy, so you only have to launch it and go. It also makes it very easy to examine the traffic in a few different ways, and it has a very lenient free trial that is fully featured (time-limited to an hour, with a few nag screens), so you can give it a good try.
Depends on what you mean by monitor...
If you simply want to know/stop when an installed application (or the OS) tries to "phone home", then I recommend Little Snitch.
The peace of mind you gain is well worth the loss of weight from your bank account.