Are there any browsers that support HTML5's Canvas that don't default to an 'Accept-Encoding' of gzip?

I'm creating a webapp where, upon connecting to my server, you will have one simple HTML page downloaded with one Canvas element in it. If your browser doesn't support Canvas, you'll get a message telling you to upgrade your browser in its place. If Canvas works, then there'll be some interactivity between my server and the canvas element.
Since I'm writing my own server, I don't really feel like properly adhering to the HTTP standard's rules for dealing with 'Accept-Encoding', since writing a function to properly check which compression is OK is something I'd rather avoid (there are a lot of other things I'd rather work on in my webapp). However, I feel like if a browser can support HTML5's Canvas, then I can assume it'll deal just fine with gzipping, and I can have all the interactivity between the browser and my site be gzipped without worrying about failure.
Does anybody know of any browsers that have HTML5 capabilities (specifically Canvas in my case) but take issue with Gzipped HTTP responses?
NOTE - I have had 0 experience with non-desktop browsers. My app isn't targeting mobile devices (resolution isn't large enough for what I'm working on), but I would be curious to know whether or not this holds for mobile browsers as well.
Best, and thanks for any responses in advance, Sami

Note that while I cannot think of any browsers with this limitation, HTTP proxies might impose it. Since content negotiation is optional and happens hop by hop, you can't guarantee support for optional features along the whole path.

I would advise against making any such assumptions.
The browser in question may support Canvas, but it could still sit behind a proxy which for some unknown reason does not support gzipped responses.
You could instead put your custom web server behind a proxy that is widely used, such as Apache or Squid, and let that proxy negotiate with the client for you. This way your own web server would only have to deal with a single client, which could simplify its implementation significantly. This intermediate proxy could also take care of many security issues for you so that you won't have to worry quite as much about hackers pwning your web server.

Here's an article indicating that about 10% of browsers did not advertise gzip support as of 2009: http://www.stevesouders.com/blog/2009/11/11/whos-not-getting-gzip/
That being said, I would think any browser that has support for canvas would also support gzip (it is an easy piece of code to add).
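For what it's worth, the check the asker wants to avoid really is small. Here is a minimal sketch in Python of the kind of helper a hand-rolled server could use; the function name and the surrounding server are hypothetical, and a real implementation would follow the full Accept-Encoding grammar more strictly:

```python
import gzip

def maybe_gzip(body: bytes, accept_encoding: str):
    """Gzip the response body only if the client's Accept-Encoding allows it.

    accept_encoding is the raw request header value (may be an empty string).
    Returns (body, extra_response_headers).
    """
    headers = {"Vary": "Accept-Encoding"}
    for token in accept_encoding.lower().split(","):
        coding, _, params = token.strip().partition(";")
        q = 1.0  # default quality when no q= parameter is given
        for param in params.split(";"):
            name, _, value = param.strip().partition("=")
            if name == "q":
                try:
                    q = float(value)
                except ValueError:
                    q = 0.0
        if coding in ("gzip", "*") and q > 0:
            headers["Content-Encoding"] = "gzip"
            return gzip.compress(body), headers
    return body, headers
```

Called as maybe_gzip(page_bytes, request_headers.get('Accept-Encoding', '')), it falls back to an uncompressed response for any client (or proxy) that does not advertise gzip, which sidesteps the whole question of which browsers might choke on it.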

Related

Why is HTTP/2 slower for me in Firefox?

There's a very interesting HTTP/2 demo that Akamai have on their site:
https://http2.akamai.com/demo
HTTP/2 (the future of HTTP) allows assets to be downloaded concurrently over a single TCP connection, reducing the need for spritesheets and concatenation... As I understand it, it should always be quicker on sites with lots of requests (like in the demo).
When I try the demo in Chrome or Safari it is indeed much faster, but when I've tested it in Firefox it's consistently SLOWER. Same computer, same connection.
Why is this?
HTTP/2 is apparently supported by all major browsers, including Firefox, so it should work fine, but in this real-world demonstration it is slower 80% of the time. (In Chrome and Safari it's faster 100% of the time.)
I tried again on the following Monday after ensuring I'd cleared all my caches:
My OS: El Capitan Version 10.11.3 (15D21) with Firefox Version 44.0.2
UPDATE (APR 2016)
Now running Firefox 45.0.1:
Still slower!
You seem to have a pretty small latency and a very fast network.
My typical results for HTTP/1.1 are latency=40ms, load_time=3.5s, and HTTP/2 is consistently 3 times faster.
With a network such as yours, other effects may come into play.
In my experience one of the most important is the cipher that is actually negotiated.
HTTP/2 mandates the use of very strong ciphers, while HTTP/1.1 (over TLS) allows for far weaker, and therefore faster, ciphers.
In order to compare apples to apples, you would need to make sure that the same cipher is used. For me, for this Akamai demo, the same cipher was used.
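If you want to verify the negotiated protocol and cipher yourself rather than trusting the demo page, Python's ssl module can report both. A minimal sketch, assuming the two Akamai demo hosts are still reachable on port 443:

```python
import socket
import ssl

def negotiated(host: str):
    """Return the ALPN protocol and cipher negotiated with host:443."""
    context = ssl.create_default_context()
    context.set_alpn_protocols(["h2", "http/1.1"])  # offer HTTP/2 first
    with socket.create_connection((host, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            # cipher() returns (cipher_name, tls_version, secret_bits)
            return tls.selected_alpn_protocol(), tls.cipher()

for host in ("http1.akamai.com", "http2.akamai.com"):
    print(host, negotiated(host))
```

If the two hosts end up on different ciphers (or different TLS versions), the demo is not comparing apples to apples on your machine.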
The other thing that may be important is that the HTTP/1.1 sources are downloaded from http1.akamai.com, while for HTTP/2 they are downloaded from http2.akamai.com. For me they resolve to different addresses.
One should also analyze how precise the time reported in the demo is :)
The definitive answer can only come from a network trace with tools like Wireshark.
For networks worse than yours, probably the majority, HTTP/2 is typically a clear winner due to its latency-related optimizations (in particular, multiplexing).
Latency matters more than absolute load time if you're mixing small and big resources. E.g. if you're loading a very large image but also a small stylesheet, then HTTP/2's multiplexing over a single connection can let the stylesheet finish while the image is still loading. The page can then be rendered with the final styles and, assuming the image is progressive, will also display a low-res version of the image.
In other words, the tail end of a load might be much less important if it's caused by a few big resources.
That said, the demo page actually loads faster over HTTP/2 for me on Firefox Nightly most of the time, although there is some variance. You might need better measurements.

Serving HTTP version of site to those who don't support HTTP2

I'd like to move my client's site entirely to HTTPS in order to enable HTTP/2; however, I was wondering whether it is OK (in the eyes of search engines) to keep serving plain HTTP to older traffic (of which there is a lot, and which would otherwise suffer a perf hit) that does not support HTTP/2.
Is this dangerous to do from an SEO point of view? And could you do the detection with tools like WURFL?
I want to stay current and offer improved perf/security to those on newer browsers, but I don't want those on older browsers in developing countries to suffer.
For what it's worth, I did some tests a few weeks ago and I got the impression that Google's spiders don't see HTTP/2 yet. But as #sbordet pointed out, the upgrade to HTTP/2 is optional, so just be sure to have a site that also responds to HTTP/1.1. Here are a few more thoughts:
Google's algorithms will penalize slower sites, but it is unlikely that you will take a big performance hit from using HTTPS in your servers.
Using HTTPS can actually boost your SEO. Doesn't have anything to do with HTTP/2.
Popular browsers that don't support HTTP/2: Safari and IE. Safari doesn't support any TLS crypto-suite compatible with HTTP/2, AFAIK. But that won't cause problems as long as you list HTTP/2-compatible suites first in your TLS server hello: ECDHE-RSA-AES128-GCM-SHA256 and ECDHE-RSA-AES256-GCM-SHA384 are the ones I know of. Then you can list weaker suites.
You don't need to serve different content depending on whether you use HTTP/2 or HTTP/1.1, as your question title may hint (sorry if I misunderstood).
Also, just because you updated to HTTP/2, it does not mean that your server cannot serve HTTP/1.1 anymore.
You can easily update to HTTP/2, and retain HTTP/1.1 support for older devices or networks that do not support or do not allow HTTP/2 traffic.
Whether a client and a server can speak HTTP/2 is negotiated: only if the server detects that the client supports it will it be used; otherwise the server falls back to HTTP/1.1. Therefore you don't risk making your site unavailable for older browsers in developing countries.
Then again, HTTP/2 implementations may vary, but typically they have to be prepared for clients that don't speak HTTP/2, and use HTTP/1.1 for those (because otherwise they wouldn't be able to serve content and the service would appear to be down).
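To make the fallback concrete: the negotiation described above usually happens through TLS ALPN, where the client lists the protocols it speaks and the server picks the best one it also supports. A minimal sketch of the server side of that offer using Python's ssl module (the certificate paths are placeholders, and the actual request handling would live in whatever server framework you use):

```python
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain("server.crt", "server.key")  # placeholder paths

# List HTTP/2-compatible cipher suites first, as suggested above,
# then let OpenSSL append its remaining defaults for older clients.
context.set_ciphers("ECDHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:DEFAULT")

# Offer HTTP/2 first, with HTTP/1.1 as the fallback. A client that does not
# send ALPN at all, or does not list h2, simply stays on HTTP/1.1.
context.set_alpn_protocols(["h2", "http/1.1"])

def protocol_for(tls_socket: ssl.SSLSocket) -> str:
    # None means the client sent no ALPN extension; treat it as HTTP/1.1.
    return tls_socket.selected_alpn_protocol() or "http/1.1"
```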

Since HTTP 2.0 is rolling out, are tricks like asset bundling still necessary?

How can we know how many browsers support HTTP 2.0?
How can we know how many browsers support HTTP 2.0?
A simple Wikipedia search will tell you. They cover at least 60% of the market, and probably more once you break down the browsers lumped into the sub-10% 'other' category. That's pretty good for something that's only been a standard for a month.
This is a standard people have been waiting for for a long time. It's based on an existing protocol, SPDY, that's had some real world vetting. It gives some immediate performance boosts, and performance in browsers is king. Rapid adoption by browsers and servers is likely. Everyone wants this. Nobody wants to allow their competitors such a significant performance edge.
Since HTTP 2.0 is rolling out, are tricks like asset bundling still necessary?
HTTP/2 is designed to solve many of the existing performance problems of HTTP/1.1. There should be less need for tricks to bundle multiple assets together into one HTTP request.
With HTTP/2, multiple requests can be performed over a single connection. An HTTP/2 server can also push extra content to the client before the client requests it, allowing it to pre-load page assets in a single round trip, even before the HTML is downloaded and parsed.
This article has more details.
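To see the single-connection multiplexing from code rather than a browser, one option is the third-party httpx library (an assumption here; it is not part of the standard library and needs its optional HTTP/2 extra installed). A minimal sketch with hypothetical asset URLs:

```python
import httpx  # pip install "httpx[http2]"

# Hypothetical asset URLs; substitute real ones from your own page.
urls = [f"https://example.com/assets/tile-{n}.png" for n in range(1, 7)]

# One Client is one connection pool; with http2=True these requests can all
# be multiplexed over a single TCP+TLS connection instead of opening six.
with httpx.Client(http2=True) as client:
    for url in urls:
        response = client.get(url)
        print(url, response.http_version, response.status_code)
```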
When can we move on to the future of technologies and stop those dirty optimizations designed mainly for HTTP 1?
Three things have to happen.
Chrome has to turn on their support by default.
This will happen quickly. Then give a little time for the upgrade to trickle out to your users.
You have to use HTTPS everywhere.
Most browsers right now only support HTTP/2 over TLS. I think everyone was expecting HTTP/2 to only work encrypted to force everyone to secure their web sites. Sort of a carrot/stick, "you want better performance? Turn on basic security." I think the browser makers are going to stick with the "encrypted only" plan anyway. It's in their best interest to promote a secure web.
You have to decide what percentage of your users get degraded performance.
Unlike something like CSS support, HTTP/2 support does not affect your content. Its benefits are mostly performance. You don't need HTTP/1.1 hacks. Your site will still look and act the same for HTTP/1.1 if you get rid of them. It's up to you when you want to stop putting in the extra work to maintain them.
Like any other hack, hopefully your web framework is doing it for you. If you're manually stitching together icons into a single image, you're doing it wrong. There are all sorts of frameworks which should make this all transparent to you.
It doesn't have to be an all-or-nothing thing either. As the percentage of HTTP/1.1 connections to your site drops, you can do a cost/benefit analysis and start removing the HTTP/1.1 optimizations which are the most hassle and the least benefit. The ones that are basically free, leave them in.
Like any other web protocol, the question is how fast will people upgrade? These days, most browsers update automatically. Mobile users, and desktop Firefox and Chrome users, will upgrade quickly. That's 60-80% of the market.
As always, IE is the problem. While the newest version of IE already supports HTTP/2, it's only available in Windows 10 which isn't even out yet. All those existing Windows users will likely never upgrade. It's not in Microsoft's best interest to backport support into old versions of Windows or IE. In fact, they just announced they're replacing IE. So that's probably 20% of the web population permanently left behind. The statistics for your site will vary.
Large institutional installations like governments, universities and corporations will also be slow to upgrade. Regardless of what browser they have standardized on, they often disable automatic updates in order to more tightly control their environment. If this is a large chunk of your users, you may not be willing to drop the HTTP/1.1 hacks for years.
It will be up to you to monitor how people are connecting to your web site, and how much effort you want to put into optimizing it for an increasingly shrinking portion of your users. The answer is "it depends on who your users are" and "whenever you decide you're ready".

HTTP Tools for analysis and capture of requests/response

I am looking for tools that can be used for debugging web applications. I have narrowed my search to the following tools:
HttpWatch
Fiddler
ieHTTPHeaders
LiveHTTPHeaders
It would be great if some of you having experience with these tools could discuss their pros and cons (features that you like or think are missing in some of the tools but present in others). I am mainly torn between HttpWatch and Fiddler; I would prefer Fiddler (being free) if it could fulfill all or most of HttpWatch's features (however, I am ready to pay for HttpWatch if it's worth it).
P.S. - I know HttpWatch and Fiddler are far more powerful than the other two tools (let me know if you disagree).
I am sure most of you would want more details as to what exactly I would like to do with these tools; however, I would like it if you could compare them from a broader perspective, as general-purpose tools.
** Disclaimer: Posted by Simtec Limited **
Here's a list of the main advantages of HttpWatch (our product) and Fiddler. Of course we're biased, but we've tried to be objective:
HttpWatch Advantages
Shows requests that were read from the browser cache without going onto the network
Shows page level events, e.g. Render Start, DOM Load, etc
Handles SSL traffic without certificate warnings or requiring changes to trusted root CAs
Reduces the 'observer effect' by not requiring an HTTP proxy at the network level
Groups requests by page
Fiddler Advantages
Works with almost any HTTP client, not just Firefox and IE
Can intercept traffic from clients on non-Windows platforms, e.g. mobile devices
Requests can be intercepted and modified on the fly, e.g. change cookie value
Supports plugins to add extra functionality
Wireshark works at the network layer and of course gives you more information than the other tools mentioned here; however, if you want to debug web applications by breaking on requests/responses, modifying them and replaying, Fiddler is the tool for you!
Fiddler cannot, however, show TCP-level information, and in such cases you will need Network Monitor or Wireshark.
If you specify what exactly you want to do with the 'debugger', one can suggest what's more appropriate for the job.
Fiddler is good and simple to use. Wireshark is also worth considering since it gives a lot of extra information.
You could also use Wireshark which allows you to analyze many protocols including TCP/IP.
A lab exercise from a University lecture on using Wireshark to analyze HTTP can be found here: Wireshark Lab: HTTP
Take a look at HTTP Debugger Pro.
It works with all browsers and custom software and doesn't change proxy settings.

Issues with HTTP Compression?

We are investigating the use of HTTP compression on an application being served up by JBoss. After making the setting change in the Tomcat SAR, we are seeing compression of about 80% - this is obviously great; however, I want to be cautious... Before implementing this system-wide, has anyone out there encountered issues using HTTP compression?
A couple points to note for my situation.
We have full control over browser - so the whole company uses IE6/7
The app is internal only
During load testing, our app server was under relatively small load - the DB was our bottleneck
We have control over client machines and they all get a spec check (decent processor/2GB RAM)
Any experiences with this would be much appreciated!
Compression is not considered exotic or bleeding edge and (fwiw) I haven't heard of or run into any issues with it.
Compression on the fly can increase CPU load on the server. If at all possible, pre-compressing static resources and caching compressed dynamic responses can combat that.
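Pre-compressing is easy to script: compress each static asset once at build or deploy time and have the server hand out the .gz sibling to clients that accept gzip. A minimal sketch, with a hypothetical static/ directory and a naive skip list for formats that are already compressed:

```python
import gzip
from pathlib import Path

ALREADY_COMPRESSED = {".gz", ".png", ".jpg", ".jpeg", ".gif", ".zip"}
static_dir = Path("static")  # hypothetical directory of assets

for asset in static_dir.rglob("*"):
    if asset.is_file() and asset.suffix.lower() not in ALREADY_COMPRESSED:
        compressed = gzip.compress(asset.read_bytes(), compresslevel=9)
        # Write style.css.gz next to style.css; the server can then pick
        # whichever variant matches the request's Accept-Encoding.
        (asset.parent / (asset.name + ".gz")).write_bytes(compressed)
```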
It's just a really good idea all the way around. It will add a slight CPU load to your server, but that's usually not your bottleneck. It will make your pages load faster, and you'll use less bandwidth.
As long as you respect the client's Accept-Encoding header properly (i.e. don't serve compressed files to clients that can't decompress them), you shouldn't have a problem.
Oh, and remember that deflate is faster than gzip.
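If you want to check that on your own payloads rather than take it on faith, both encodings are in Python's standard library, so a quick measurement is only a few lines (the sample file path is hypothetical):

```python
import gzip
import time
import zlib
from pathlib import Path

data = Path("sample.html").read_bytes()  # hypothetical payload; use one of your own pages

# zlib.compress produces the zlib-wrapped stream that HTTP "deflate" is meant to be.
for name, compress in (("gzip", gzip.compress), ("deflate", zlib.compress)):
    start = time.perf_counter()
    out = compress(data)
    elapsed = (time.perf_counter() - start) * 1000
    print(f"{name}: {len(out)} bytes in {elapsed:.2f} ms")
```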
