Why do browsers not have a file upload progress bar?

I wonder why no browser out there has such a simple but essential feature. Am I missing something? Is there a technical reason?
I'm tired of all those javascript/flash/java hacks out there ...

There is no technical reason preventing the browser from calculating the total bytes to be sent and then tracking how many have been received by the server (thanks, Kibbee, for your comment). Firefox had a functional upload progress indicator until version 0.9, but a 2004 build broke it.
Reading through the Bugzilla updates, it seems that this feature doesn't benefit enough users to get any traction from the developers.
Users who regularly upload very large files tend to use tools like FTP that are designed for this purpose, so they are not affected.

Adding to flamingLogos's argument: you might operate behind a proxy that takes your five megabytes of pure goodness within a second and then sends it off to the server over a 56kbit modem.
I perceive a wrong progress bar as slightly worse than no progress bar at all, and there would be many people for whom it would be wrong all of the time.

Yes, it's silly, and for some reason browser makers are ignoring it.
I would strongly dispute that large-file users use FTP - hardly anyone knows about it anymore, and all the common web apps require HTTP uploads for video, audio and pictures (e.g. YouTube).
Ironic that user participation and media are the key to Web 2.0, yet the main mechanism for user participation is so poorly handled by browsers.
For Firefox, there have been bugs languishing for years, such as this one for a better upload progress display:
https://bugzilla.mozilla.org/show_bug.cgi?id=243468
Get voting! :)
The existing progress bar in the status bar has been broken for years - see bug 249338 - and it will let you silently abort an upload - see bug 432768.

If you are using Firefox, you can use the new UploadProgress add-on (https://addons.mozilla.org/en-US/firefox/addon/221510/), designed for this purpose: it displays the progress of your uploads and an estimated remaining time.

You have to post back to upload a file, regardless of whether or not you are being "sneaky" about it (using hidden iframes, for example); the browser's own progress bar (usually down in the status bar) is the file upload progress bar in that sense, although not exactly.
It's just that you can't easily use that data for yourself, so you have to approximate it with a lot of client-to-server communication tricks.
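For what it's worth, newer browsers expose upload progress directly to page scripts through XMLHttpRequest level 2 upload events, which removes the need for most of those tricks. A minimal sketch, assuming a placeholder /upload endpoint and a form on the page:

var xhr = new XMLHttpRequest();
xhr.open('POST', '/upload'); // placeholder endpoint
xhr.upload.onprogress = function (e) {
  // lengthComputable is false when the total size is unknown
  if (e.lengthComputable) {
    console.log(Math.round(100 * e.loaded / e.total) + '% uploaded');
  }
};
xhr.send(new FormData(document.querySelector('form')));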

There's no real technical reason you couldn't have a reasonable progress indicator as you do with downloads. You should suggest it as a feature request to your favorite browser.
That said, I think the main reason there are so many javascript/flash/ajax-based upload components isn't so much to provide progress bars (though that's a nice bonus). It's usually because they want to provide a better UI for selecting the data to be uploaded and to sometimes manipulate the data before uploading. The basic file upload feature that's in the HTML specs results in the "Browse..." button that pops up a file open dialog and uploads the raw file data as is to the server.

Chrome has an upload bar that shows the percentage of the upload completed.
Or, as Peuchele says, there's also an add-on for Firefox.

The web browser has always been just that: a browser of the web. It is a mechanism for consumption. Our ability to upload information through the same portal is somewhat of a hack.

Related

Google map not responding in Android's native browser after zooming in and/or navigating the area 4-5 times

I have developed a mobile site using the ICEfaces Mobile framework, in which I want to search for things based on the area zoomed in on the mobile screen. Users can zoom in/out and navigate the area, but the problem is that after doing this 4 to 6 times the browser becomes unresponsive. I think it must be a problem with the browser's ability to handle/execute the JavaScript (correct me if I'm wrong).
I have generated the latest API key for the map using the standard steps given by Google on their forum.
Thanks in advance.
The stock Android browser does suffer from serious limitations, and it could be device performance limitations, as the previous poster mentioned. But you should also be careful that you're not inadvertently causing a memory leak in your own JavaScript. Are DOM updates from Ajax interactions being generated during this? Check the Android LogCat messages to see the ICEfaces/ICEmobile logging, which will show the updates. Check to ensure that, if your custom function is being re-run, you're not causing a memory leak there. If not, I doubt there's an issue with the GMap code and it's likely just a device limitation. Perhaps also put your gmap code in a separate HTML page without ICEmobile and see if you still have the problem.
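As an illustration only (attachZoomHandler and onZoom are hypothetical names), the classic leak pattern with Maps is re-attaching listeners every time setup code re-runs, without removing the old ones:

// Keep a handle on the listener and remove it before re-attaching,
// so repeated setup calls don't accumulate handlers.
var zoomListener = null;
function attachZoomHandler(map) {
  if (zoomListener) {
    google.maps.event.removeListener(zoomListener);
  }
  zoomListener = google.maps.event.addListener(map, 'zoom_changed', onZoom); // onZoom: your handler
}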

cache background image

Is there a way to "cache" a background image?
For example..
Background image is 3x3px and it's set like this:
body {
background: #000 url(bg.png);
}
When a refresh happens, the background image "flickers" for a second.
Is there a cross-browser solution? (for Apache/PHP server if that is relevant)
If you go to seo.hr and browse the navigation, you can see what I'm trying to do.
http://www.seo.hr/
http://www.seo.hr/usluge/izrada-stranica
http://www.seo.hr/usluge/optimizacija-za-trazilice
I think you need to determine first if the issue actually is a caching issue or if it's caused by the size of your image. You could use a program like Wireshark or Fiddler to do this, but to be honest it's overkill for your need and you probably already have a browser with developer tools.
Here's how you determine where an image is coming from in Chrome (the other browsers are similar).
Open your developer tools and go to the "Network" tab.
Find "bg.png" in the list of network requests and click on it's name. Below is an example of having selected a stack overflow image from this page.
Notice that it says Status 200 (from cache). The browser didn't need to go out to the server and re-request that resource; it used the cache. If that "(from cache)" text isn't there, the browser isn't reusing cached resources.
There is also the potential that you'll get a status code of 304. That means the server said the image hasn't been modified since your last request. You still make the trip to the server in that case.
Ok, so my image wasn't in cache... now what?
There are a few reasons that this could occur.
Your response headers aren't set to tell the browser to cache the image (they're visible in that same "Headers" tab where you would have seen the status code if the browser actually went to the server for the image). You'll want to set Cache-Control and Expires to something that makes sense for you. Cache headers can get a bit complicated; you may want to browse through this caching tutorial document.
Is it SSL? If so, not all browsers cache this, but most modern browsers do. Set Cache-Control: public on these images (and also Expires).
The real question here is: how do you fix this? Unfortunately, that's entirely dependent on the server and/or the framework you are using. As the OP is using Apache, they can find great documentation on the mod_expires module to figure out how to tweak caching for their site.
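For example, a minimal .htaccess sketch, assuming mod_expires is enabled (adjust the type and lifetime to suit your site):

# Cache PNG images for a month after first access
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
</IfModule>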
Yes!
You should decide what's most suitable for you, but at this time we have several methods, like:
Pure HTML/CSS
Javascript Only
Mixed HTML/CSS/Javascript
Using base64 to encode the image somewhere in the source code
At this point I recommend a mixed solution using JavaScript. This will make it work on as many browsers as possible.
There is a good tutorial at:
http://perishablepress.com/press/2009/12/28/3-ways-preload-images-css-javascript-ajax/
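The JavaScript half of that approach can be as small as constructing an Image object so the browser fetches and caches the file before it is needed as a background (a sketch, reusing the bg.png name from the question):

// Preload the background tile into the browser cache
var bg = new Image();
bg.src = 'bg.png';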
Having several images in one can take you a step beyond that, so check this sprites article:
http://www.alistapart.com/articles/sprites/
You can try to encode your image in base64 and put it directly into the CSS source code. I found a question about the pros and cons over here.
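A sketch of what that looks like in the stylesheet, with the encoded bytes truncated for illustration:

body {
  /* base64 payload truncated; generate it with a tool or base64_encode() in PHP */
  background: #000 url(data:image/png;base64,iVBORw0KGgo...) repeat;
}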
Make your tiled image much, much larger. When the browser engine renders the page, it has to repeat each tile to cover the entire width and length of your object, which results in bad performance with small tiles on large objects.
Small tiles -> more repetitions -> slower performance

Is there a ready solution to send just part of an interlaced JPEG depending on the browser resolution?

I'm asking whether you know of a ready-made solution, not really how to do it.
I'm quite sure I could pull it off myself, even though I've never touched the bytes of a JPEG manually. If you'd like a crack at it, you're invited to do so ;)
The basic idea is that you have a site with a few JPEG images, but you want to reduce load as much as possible for mobile users.
So you ensure that all of your JPEGs are progressive, send only the low-frequency bits of each first, idle down the TCP connection, and wait for the client to report how much space is available in the browser window.
Or alternatively, you have some sort of browsercaps.ini or similar, and rely on that to get the initial resolution -- and then have the reporter report a correction if necessary.
I actually need this for two entirely separate environments: one uses PHP and the other uses node.js (the latter is of more importance).
I'm quite sure picasaweb is doing this stuff already, or at least did. You could view an image and it would load progressively -- then you could enlarge it, and it got blocky but continued to load progressively. I remember being quite impressed by that!
(And it's unfair that Google keeps the cool stuff for themselves; remember their motto {°«°] )
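For illustration, a rough node.js sketch of the serving half of this idea; the file name, the URL scheme and the 16 KB cutoff are all assumptions:

var http = require('http');
var fs = require('fs');
http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'image/jpeg' });
  if (req.url.indexOf('preview=1') !== -1) {
    // The first scans of a progressive JPEG render as a blocky preview
    fs.createReadStream('photo.jpg', { start: 0, end: 16383 }).pipe(res);
  } else {
    fs.createReadStream('photo.jpg').pipe(res);
  }
}).listen(8080);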
Why not send the client a list of images that could be used for a specific img tag, then have the client determine which one it should use?
It is possible to determine the screen size of the device (e.g. document.write(screen.width + 'x' + screen.height);) or the size of the browser window. And instead of adding a src attribute for each image, add the possible sources to an HTML5 data- attribute, like so:
<img data-img="mobile:some-img.jpg,desktop:other-img.jpg" />
JavaScript (With jQuery):
$('img').each(function(){
  // Pick the mobile or desktop source by screen width (the 768px breakpoint is arbitrary)
  var pairs = $(this).attr('data-img').split(',');
  $(this).attr('src', (screen.width < 768 ? pairs[0] : pairs[1]).split(':')[1]);
});

Disable print, print screen, right click using asp.net

How do I disable print, Print Screen and right click using ASP.NET?
You can't. You cannot prevent content from being copied from your pages.
Disabling right click is possible, but it doesn't solve your 'problem'. The user could still copy your image by disabling JavaScript or just inspecting the source.
And even if you could disable those keys, the user could still just take a photo of their monitor. Good luck disabling that!
Short answer: You don't. You are writing a web application; features of the underlying platform are outside your scope, and you have no business trying to fiddle with them.
Long answer: You can try to capture those keys using javascript, and override the default behaviour, which will somewhat stop very naïve users, but all it takes to disable this "security" is to turn off javascript. Even if you come up with more sophisticated "protection", the essence remains: You are sending content to the client, and once it gets there, it is out of your hands. Given suitable tools (wget is enough for most things), anyone can copy and modify your content in any way they like. Similarly, whatever can be shown on the screen inside a browser can be captured and saved. There is no way around it. If you don't want your content copied, don't send it.
Forget about it. You will irritate your end users, who will find a way to outwit you and do what you didn't want them to do. Forbidden fruit is always the sweetest. By telling them explicitly "you cannot do this", they will wonder why you want to guard your content, and they might try even harder to do the stuff you otherwise wouldn't want them to do.
Psychology and technology are against you in this case.
Printing
You could disable printing (well, sort of - it's not 100% effective) using a "print" style sheet.
I have not tried it myself, but here is a link that could get you started: http://webdesign.about.com/od/advancedcss/qt/block_print.htm
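The usual trick in such a style sheet (again, only a sketch and not 100% effective) is a print media rule that hides the page body:

@media print {
  /* hide everything when the page is sent to a printer */
  body { display: none; }
}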
Print screen
Print screen is something that is typically controlled by the operating system, not the browser or the webpage, so you are unlikely to be able to stop it. However, casting my mind back, I remember a time (perhaps a long, long time ago, maybe Windows 98) when you couldn't take screenshots of videos in Windows... so if you're really in need of disabling print screen, you could perhaps encode your content in a video... but this has many, many downfalls - namely accessibility, search engine optimisation, and it being a royal pain to do - so I wouldn't recommend it under any circumstance.
Right click
Right click you can disable, but not with a server-side technology (such as ASP.NET); instead you need a client-side technology such as JavaScript. A quick search in your favourite search engine will find some help. But disabling right click is rudimentary to get around, so it is not foolproof.
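For completeness, the client-side snippet is tiny, and just as tiny to defeat (turn JavaScript off and it is gone):

// Rudimentary right-click block; suppresses the context menu
document.oncontextmenu = function () { return false; };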
An alternative way to protect your content is possibly to investigate "rights" in PDFs. I believe you can disable the "right" to print.
However, none of these solutions is going to be foolproof. As long as you are making your content available to an end user on their own computer, there will always be a way around your restrictions.
I have implemented print disabling using window.onbeforeprint.
Refer to this answer.
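A sketch of that approach, which hides the body just before printing and restores it afterwards (support for these events varies between browsers):

window.onbeforeprint = function () { document.body.style.visibility = 'hidden'; };
window.onafterprint = function () { document.body.style.visibility = 'visible'; };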

How does YouTube prevent video content from being saved/redistributed?

Sure, you can embed a YouTube video on any site, but the content ultimately must come from their server. What technology(ies) do they have that prevents us from saving/redistributing content?
From a protocol standpoint, you would think that anything that comes over the wire could be saved. I hope I am not the only guy on Earth who does not know how to "save" a YouTube video...
There are a couple of plugins for Firefox that let you save the content. Basically they parse the source code and look for the video file (either .flv or .mp4) and download it directly; the Flash player on the page just plays the supplied file. They could of course obfuscate the path to the video file, but that can be reverse-engineered as well. They can't really do anything about it, because the video file has to be on the user's computer at some point - and even if it didn't, the stream could be intercepted as well.
e.g. https://addons.mozilla.org/de/firefox/addon/6584/?src=api
Mostly it's a legal deterrent rather than a technical one. There are a plethora of programs out there that will let you download their video. But two things help reduce unauthorized downloads:
They use Flash to control the download and playback.
Hosting video yourself is not cheap, so it's much easier to simply leave the video on YouTube.
They don't do anything about it. Very likely your Flash viewer downloads a copy and puts it somewhere on your hard drive (under my Linux system with Firefox and Adobe Flash, in /tmp). After you are done viewing, the file is removed to save disk space, but since it was on your hard drive, nothing prevents you from making a copy elsewhere.
You might want to look at the 'analogue hole': in the end, the data still has to be displayed on your screen or come out of your speakers. It's always theoretically possible to intercept it at that point, or even just record your audio-out into another machine.
So as far as the analogue hole goes, the only solution is to skip it entirely, in this form:
[embedded image (source: thisdomainisirrelevant.net)]
Which is not that marketable.
