Local URL fallbacks for offline development - CSS

I often work remotely, on the train or in places where I have no internet connection, or an unstable one. Our app loads some fonts, CSS and JS from different CDNs (Google and Microsoft). When I'm offline I don't have access to these files and can't work properly.
Even worse, when I have a bad connection, my browser waits until the requests time out, which slows everything down.
Is there a solution where I can set up a local fallback for some URLs and serve this content when no internet connection is available?
I'm on OS X, and maybe there is some proxy tool out there I don't know about that can handle such a thing. BTW: HTTP would be enough, so no dealing with SSL would be necessary for development.

There's a great answer to a similar question on the Webmasters Stack Exchange site. In short, you can use Charles Proxy to redirect certain requests to a local file. That should work well, as long as you don't have a massive list of assets (or dynamic requests).
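If you'd rather not use a commercial tool, the same idea can be prototyped with a tiny local proxy. Below is a minimal sketch for plain HTTP only (as you said, SSL isn't needed for development); the port, fallback directory, and URL-to-filename scheme are all made-up placeholders:

```python
import http.server
import mimetypes
import pathlib
import urllib.request

# Hypothetical directory holding pre-downloaded copies of the CDN assets.
FALLBACK_DIR = pathlib.Path("cdn-fallback")

class FallbackProxy(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        url = self.path  # with a proxy configured, the browser sends the absolute URL
        try:
            with urllib.request.urlopen(url, timeout=2) as upstream:
                body = upstream.read()  # online: pass the real CDN response through
        except OSError:
            # Offline or timed out: serve the local copy instead
            # (naive scheme: URL flattened to a single file name).
            local = FALLBACK_DIR / url.split("://", 1)[-1].replace("/", "_")
            body = local.read_bytes()
        self.send_response(200)
        self.send_header("Content-Type",
                         mimetypes.guess_type(url)[0] or "application/octet-stream")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# Point the browser's HTTP proxy setting at 127.0.0.1:8888.
http.server.HTTPServer(("127.0.0.1", 8888), FallbackProxy).serve_forever()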
Alternatively, you could just use a build script of some sort (depends on your toolchain) to rewrite the asset URLs to local versions (and of course make sure they're pointing to the proper versions when committing code).
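Under the same assumptions, a build-time rewrite can be as small as a find-and-replace over the generated pages. Everything here (the directory layout, the URLs, the local paths) is a made-up example:

```python
import pathlib

# Hypothetical mapping from CDN URLs to vendored local copies.
CDN_TO_LOCAL = {
    "https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js":
        "/vendor/jquery.min.js",
    "https://fonts.googleapis.com/css?family=Roboto":
        "/vendor/roboto.css",
}

# Rewrite every HTML page under public/ (a made-up build output directory).
for page in pathlib.Path("public").rglob("*.html"):
    text = page.read_text(encoding="utf-8")
    for cdn_url, local_path in CDN_TO_LOCAL.items():
        text = text.replace(cdn_url, local_path)
    page.write_text(text, encoding="utf-8")
```

Running the same script with the mapping inverted restores the CDN URLs before committing.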


How should the HTTP timing look?

This is the timing graph for one of my sites, https://www.alebalweb-blog.com, from the first row of the Firefox developer tools -> Network tab, and I'm not sure the Blocked and Waiting entries are "normal".
Waiting, I suspect, is the server's fault. It's a small VPS on Vultr (Ubuntu 18.04). The other day I updated to php7.4-fpm, and I haven't enabled OPcache, memcached, APCu or anything else yet, because (unfortunately) my sites are small, under a thousand visits a day, and I don't know whether it makes sense to enable caching systems. Could they also affect indexing and ranking on search engines?
Even though Yandex and Bing give my little server plenty of work... maybe they're the only ones who would benefit from a cache?
Blocked is more confusing. I'm not sure it's me; doesn't everything there happen before the request even reaches my server? Maybe it's Vultr's fault? Maybe NameSilo's (where the domains are registered)? Maybe mine, some Apache configuration or something else? Maybe they're normal values? I have no idea.
Can anyone help me understand whether these are normal values? And if they're not, help me understand how to improve them?
-------------------------update------------------------
I have read the pages you suggested; even they don't seem to have understood much or found a solution...
I did a few things on my little server: blocked Yandex, enabled OPcache, and installed memcached.
The intent is to stabilize things, so I can begin to understand something.
I have run many more tests these days, and I have seen results like these:
This is another site, but on the same server; the highlighted entry is Matomo (statistics). Its tracking JavaScript is on a subdomain, but still on the same server.
The difference is enormous, and the tests were done within seconds of each other.
So at this point maybe the question is: do you have any suggestions on what else I can do to start understanding something?
At least to understand whether these timings are caused by me, my server, my sites' scripts, the browsers, the connection, or something else.
None of what you've posted looks very bad, but your service is sometimes taking more than 6 s to respond to the initial connection request. There are probably a lot of small things wrong that you can fix; I would start by looking at this question, which addresses the same problem I'm seeing with your site.
The timing looks a bit large to me.
It seems the server does not respond for about 150 ms (Blocked), especially on the main page.
Then it takes up to 150 ms for TLS setup, 200 ms to load content, and so on.
But this is not stable.
Sometimes it took about 800 ms to receive the homepage; sometimes the whole thing took less than 200 ms.
Most likely these are server issues (your virtual server shares a physical machine with other servers).
And just for reference:
What does "Blocked" really mean in the Firefox developer tools Network monitoring?
Also, here are some general things to consider while troubleshooting (and see the timing sketch after this list).
I suggest creating a local (localhost) version of the site, then:
Check the time actually required to render the homepage (in the server log)
Temporarily remove gzip compression
Temporarily remove HTTPS
Temporarily remove output buffering in PHP (hoping your code does not need it)
Check whether any "post-processing" content hooks are active in PHP
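To take the browser out of the equation and measure each phase yourself, you could time the production site from a script and run it repeatedly to see the variance. A minimal sketch (the hostname is the one from the question; the timeout is arbitrary):

```python
import socket
import ssl
import time

host = "www.alebalweb-blog.com"  # the site from the question

t0 = time.perf_counter()
ip = socket.getaddrinfo(host, 443)[0][4][0]              # DNS lookup
t1 = time.perf_counter()
sock = socket.create_connection((ip, 443), timeout=10)   # TCP connect
t2 = time.perf_counter()
tls = ssl.create_default_context().wrap_socket(sock, server_hostname=host)  # TLS handshake
t3 = time.perf_counter()
tls.sendall(b"GET / HTTP/1.1\r\nHost: " + host.encode()
            + b"\r\nConnection: close\r\n\r\n")
tls.recv(1)                                              # first byte of the response
t4 = time.perf_counter()
tls.close()

for label, start, end in [("DNS", t0, t1), ("TCP connect", t1, t2),
                          ("TLS setup", t2, t3), ("Wait (TTFB)", t3, t4)]:
    print(f"{label:12s} {(end - start) * 1000:8.1f} ms")
```

If the wait time jumps around between runs while DNS and connect stay flat, that points at the server (or a noisy VPS neighbour) rather than the network or the browser.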

Open source CMS and server for a video streaming platform

I have to propose a platform that allows streaming video services using the MPEG-DASH standard. The platform's building blocks must be implemented with open source tools. I proposed FFmpeg for encoding and the MP4Box/GPAC tool for encryption and packaging. For DRM, my proposal is Widevine (I didn't find any other open source tool), which is compatible with dash.js (the player I proposed); it can be integrated into Chrome and, according to CastLabs, it's also compatible with MP4Box. So, I have to select an open source CMS that is at the same time compatible with dash.js. I read that it's possible to add any JavaScript to these CMSs, and that it's only necessary to create some modules to do so. I'd like to know which of the following CMSs you would suggest: MediaDrop, Drupal or WordPress.
I also have some doubts about the server. I know that offering this service only takes a traditional HTTP server. At first I chose Nginx over Apache because the latter has some performance problems (the server will receive a large number of simultaneous requests); nevertheless, I discarded Nginx (the nginx-rtmp module) due to its constraints: it's only for live streaming (I need the service to be offered on demand as well) and the inputs must be RTMP. I found something about an Nginx-based VOD packager; do you know if it can be used as a server to offer both live and on-demand streaming?
When it comes to DRM, you will need systems other than just Widevine to reach all browser platforms, e.g. PlayReady for IE/Edge or FairPlay with HLS for Safari. Here you can find an overview of the DRM systems for the different browsers: https://bitmovin.com/player-drm-support/
If you already use FFmpeg + MP4Box to encode and package the content, you don't need dedicated VoD packager support on your web server; you can just serve the DASH/HLS content from a plain HTTP web server. Here you can find a tutorial for x264 + MP4Box, maybe that's useful: https://bitmovin.com/mp4box-dash-content-generation-x264/
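As an illustration of that pipeline, here is a rough sketch of driving both tools from a script. The file names and bitrates are placeholders, and the exact flags depend on your FFmpeg/GPAC versions, so treat it as a starting point rather than a finished recipe:

```python
import subprocess

src = "input.mp4"  # hypothetical source file

# Encode one H.264/AAC rendition with FFmpeg.
subprocess.run(
    ["ffmpeg", "-i", src,
     "-c:v", "libx264", "-b:v", "2M",
     "-c:a", "aac", "-b:a", "128k",
     "encoded.mp4"],
    check=True,
)

# Package it into 4-second MPEG-DASH segments plus an MPD manifest.
subprocess.run(
    ["MP4Box", "-dash", "4000", "-rap",
     "-out", "manifest.mpd", "encoded.mp4"],
    check=True,
)
```

Any static HTTP server can then serve manifest.mpd and the segments, and dash.js is simply pointed at the manifest URL.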

Alternative to Java applets for network drive access

With Chrome on the verge of definitively breaking compatibility with NPAPI, and IE breaking with ActiveX, the future of Java applets is dark. Currently we actively use a secure applet for our client organizations that enables their users to upload a bunch of files from their file system to our servers with the click of a button. The applet has full access to any configured drive, including network drives.
With the imminent death of the applet, this functionality is going to be lost if we don't find an alternative. I have already tried to explore different solutions, including the Chrome FileSystem API, but that is currently only available for Chrome (http://caniuse.com/#feat=filesystem) and has limited access.
Does anybody know of an alternative that keeps supporting this much-appreciated functionality? Unfortunately we are obligated to support all browsers down to IE8.
I've written a post about this here.
Since Google Chrome was the first to announce that they won't be supporting NPAPI anymore, they were also the first to provide a new architecture for rewriting your code to work in their browser. You can take a look at Native Messaging, which "can exchange messages with native applications using an API that is similar to the other message passing APIs". The problem is that this approach only works in Chrome; it is not something you can adapt to other browsers.
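For a feel of what that involves: a native messaging host is just a local program that Chrome launches and talks to over stdin/stdout, with each JSON message preceded by a 4-byte little-endian length. A minimal sketch of such a host follows; the echo behaviour is a placeholder (a real host would do the file access), and the host still has to be registered with Chrome via a host manifest:

```python
import json
import struct
import sys

def read_message():
    # Chrome prefixes each message with a 4-byte little-endian length.
    raw_length = sys.stdin.buffer.read(4)
    if len(raw_length) < 4:
        return None  # Chrome closed the pipe
    length = struct.unpack("<I", raw_length)[0]
    return json.loads(sys.stdin.buffer.read(length).decode("utf-8"))

def send_message(message):
    data = json.dumps(message).encode("utf-8")
    sys.stdout.buffer.write(struct.pack("<I", len(data)))
    sys.stdout.buffer.write(data)
    sys.stdout.buffer.flush()

# Placeholder loop: echo every request back to the extension.
while True:
    msg = read_message()
    if msg is None:
        break
    send_message({"echo": msg})
```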
A more useful approach is FireBreath, a browser plugin framework for a post-NPAPI world. Check the words below from one of the project's contributors:
“FireBreath 2 will allow you to write a plugin that works in NPAPI, ActiveX, or through Native Messaging; it’s getting close to ready to go into beta. It doesn’t have any kind of real drawing support, but would work for what you describe. The install process is a bit of a pain, but it works. The FireWyrm protocol that the native messaging component uses could be used with any connection that allows passing text data; it should be possible to make it work with js-ctypes on firefox or plausibly WEB-RTC or even CORS AJAX in some way. For now the only thing we needed to solve was Chrome, but we did it in a way that should be pretty portable to other technologies.”
In light of the answer provided by Uly Marins, I have researched the suggested options. Unfortunately these options weren't viable for our application, because the majority of our users do not have sufficient rights to install third-party plugins. Additionally, the API is still in beta, which won't do any good in a stable production environment.
The main problem we wanted to solve was the ability to delete files from the accessed folders. It seems one of the major goals of removing NPAPI support was exactly to prevent this kind of possibility. Therefore we reduced our goals to a simple solution that was still acceptable for our users, with additional training on how to clear the selected folder manually (because most of our users are almost computer illiterate and needed to access network folders).
Long story short: the requested solution is simply not possible anymore and had to be replaced by a simpler solution plus additional training.

Broken caching of images loaded from homemade webserver

A while ago I wrote a webserver which I'm using on a site of mine.
When I navigate to another page in Chrome while the images from this homemade webserver are still loading, they stay cached as only half-loaded.
Is this a known bug in Chrome, or an issue with my implementation of the HTTP protocol?
My webserver uses ETags for caching.
First Rule of Programming: It's your fault. Start with your code, and investigate further and further outward until you have definitive evidence of where the problem lies.
You need to apply this rule here. What are the chances that Chrome, when communicating with a server like Apache, would exhibit this kind of bug deep into its sixth (at least) major iteration?
I would put a traffic analyser on your server and view the exchanges carefully. Next I would compare them with those from a well-established web server like Apache and note any differences.
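One concrete thing to compare is how your server terminates responses. A common way half-loaded images end up cached is a 200 response that carries an ETag but no accurate Content-Length, so the client can't tell a truncated body from a complete one. Here is a minimal sketch of the conditional-GET handshake, assuming a single hypothetical image.jpg; it is an illustration of the protocol, not your server's code:

```python
import hashlib
import http.server

class ImageHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        with open("image.jpg", "rb") as f:   # hypothetical static file
            body = f.read()
        etag = '"%s"' % hashlib.sha1(body).hexdigest()

        # Conditional GET: the client revalidates its cached copy.
        if self.headers.get("If-None-Match") == etag:
            self.send_response(304)
            self.send_header("ETag", etag)
            self.end_headers()
            return

        self.send_response(200)
        self.send_header("Content-Type", "image/jpeg")
        # An accurate Content-Length lets the client detect truncated bodies.
        self.send_header("Content-Length", str(len(body)))
        self.send_header("ETag", etag)
        self.end_headers()
        self.wfile.write(body)

http.server.HTTPServer(("", 8080), ImageHandler).serve_forever()
```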

Large file download in background, initiated from the browser

Is there any reasonable method to allow users of a webapp to download large files? I'm looking for something other than the browser's built-in download dialog; the requirements are that the user initiates the download from the browser, and then some other application takes over, downloads the file in the background, and doesn't exit when the browser is closed. It might possibly work over HTTP, FTP or even BitTorrent. Platform independence would be a nice thing to have, but I'm mostly concerned with Windows.
This might be a suitable use for BitTorrent. It works using a separate program (in most browsers), and will still run after the browser is closed. Not a perfect match, but meets most of your demands.
Maybe BITS is something for you?
Background Intelligent Transfer Service
Purpose
Background Intelligent Transfer Service (BITS) transfers files (downloads or uploads) between a client and server and provides progress information related to the transfers. You can also download files from a peer.
Where Applicable
Use BITS for applications that need to: asynchronously transfer files in the foreground or background, preserve the responsiveness of other network applications, and automatically resume file transfers after network disconnects and computer restarts.
Developer Audience
BITS is designed for C and C++ developers.
Windows only
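Although the native API targets C/C++, a quick way to try BITS is the bitsadmin command-line tool that ships with Windows. A rough sketch, with a made-up URL, destination path, and job name:

```python
import subprocess

# Hypothetical URL and destination; bitsadmin ships with Windows.
url = "https://example.com/files/big-download.iso"
dest = r"C:\Downloads\big-download.iso"

# Creates a BITS job; the BITS service owns the transfer, so it survives
# the launching process exiting and resumes after network drops or reboots.
subprocess.run(
    ["bitsadmin", "/transfer", "webappDownload",
     "/download", "/priority", "normal", url, dest],
    check=True,
)
```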
Try freeDownloadManager. It does integrate with IE and Firefox.
Take a look at this:
http://msdn.microsoft.com/en-us/library/aa753618(VS.85).aspx
It's only for IE though.
Another way is to write a BandObject for IE, which hooks into all links and starts your application.
http://www.codeproject.com/KB/shell/dotnetbandobjects.aspx
Depending on how large the files are, pretty much all web browsers have built-in download managers. Just put a link to the file, and the browser will take over when the user clicks. You could simply recommend people install a download manager before downloading the file, linking to a recommended free client for Windows/Linux/OS X.
Depending on how large the files are, BitTorrent could be an option. You would offer a .torrent file, which people open in a separate download client, independent of the browser.
There are drawbacks, mainly depending on your intended audience:
BitTorrent is rarely allowed on corporate or school networks
it can be difficult to use (as it's a new concept to lots of people); for example, if someone doesn't have a torrent client installed, they get a tiny file they cannot open, which can be confusing
problems with NAT/port-forwarding/firewalls are quite common
you have to run a torrent tracker and seed the file
...but there are also benefits, mainly reduced bandwidth usage on the server, since people who download also seed the file.
