webpack intercept/hook into async importing - asynchronous

Webpack provides several ways to load modules asynchronously: require([], resolve), import('...'), and require.ensure. Using any of these makes webpack automatically split the resulting bundle into chunks containing the required modules.
The question is: how do I hook in whenever webpack tries to load another bundle chunk, show a loading animation (e.g. nprogress.start()), and then hide that animation (nprogress.done()) once the chunk has loaded?
I tried searching for webpack hooks, but everything I could find relates to the compilation phase rather than runtime (e.g. this).
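For illustration only (the thread does not answer this): one low-tech approach is to route every dynamic import through a small wrapper that drives nprogress, rather than hooking webpack's runtime itself. The helper name loadChunk is made up.

import NProgress from 'nprogress';

// Wrap a dynamic import so the progress bar runs while the chunk downloads.
function loadChunk(importFactory) {
    NProgress.start();
    return importFactory().finally(() => NProgress.done());
}

// Usage: pass a factory so the import (and the chunk request) starts here.
loadChunk(() => import('./heavy-page')).then((mod) => mod.default());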

Related

How to properly do tree shaking to reduce bundle size and separate entry point for each cloud function

I am using Google Firebase Cloud Functions with TypeScript, and I found that even though each function is deployed separately, they all share the same bundle and dependencies, even if some functions neither use nor import them.
In my case, one cloud function uses Redis and the others don't. I have 10 functions, and all 10 end up loading the Redis-related code even though only one of them imports it.
Since all functions share the same entry point, index.js, it currently seems impossible to have a separately tree-shaken bundle / entry point for each function.
This is very inefficient in terms of bundle size, cold start time, memory, etc. It also means that as I add more functions, the bundle size grows for all of them together. It doesn't scale.
Is there any way to avoid sharing the entry point, index.js, and have completely separate bundles by using a bundler like webpack?
You can create a separate local Firebase working area (with firebase init) for each function that should be deployed in isolation from the others. You will then have to instruct the CLI not to overwrite the other functions on deployment, by deploying with --only functions:yourFunctionName.
Alternatively, you can deploy functions using the Cloud tools (gcloud) instead of the Firebase tools, but you won't be able to use firebase-functions and its TypeScript bindings.
Or, you can lazily load your modules inside your functions instead of statically loading them at the global scope, as described in this video.
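A minimal sketch of that lazy-loading pattern (the function name and Redis client usage are illustrative, not from the question's code):

const functions = require('firebase-functions');

// Sketch only. No top-level require('redis'), so the other functions
// never pay the cost of loading it.
exports.cacheStats = functions.https.onRequest((req, res) => {
    // Loaded on first invocation of this function only; Node caches it after that.
    const redis = require('redis');
    const client = redis.createClient();
    // ... use the client ...
    res.send('ok');
});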
I don't recommend using webpack. It's not going to be worth your time to get it configured.
You might try better-firebase-functions, which solves this elegantly by lazy loading only the function that is currently being invoked, determined by checking the FUNCTION_NAME environment variable - see https://link.medium.com/4g3CJOLXidb
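The underlying trick is roughly this (a sketch of the pattern, not better-firebase-functions' actual code; the function and module names are hypothetical):

// FUNCTION_NAME identifies which function this container was started for.
const target = process.env.FUNCTION_NAME;

// Only attach (and therefore only load) the export that is actually needed.
if (!target || target === 'redisWorker') {
    exports.redisWorker = require('./redis-worker'); // hypothetical module
}
if (!target || target === 'mailer') {
    exports.mailer = require('./mailer'); // hypothetical module
}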

Can Meteor 1.5 dynamic importing be done on multiple scripts simultaneously?

Meteor 1.5 was recently released, and it has... dynamic imports!
All the examples I see are for dynamically importing one file, and then running code for it. I'm curious if there is a way to dynamically import multiple files with one declaration?
Thanks. :)
Depends on what you mean by "import multiple files"…
Whenever you request a dynamic import, Meteor will fetch the required module and all of its dependencies that are not already available in the browser cache.
Unlike classic HTTP requests, the downloads are not parallelized; they go through the already-open WebSocket, so all modules pass through a single pipe / queue.
If your code depends on multiple modules, you could simply write a "parent" module that imports them all.
Or you could also aggregate your dynamic imports with Promise.all, since they return Promises.
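For example, a sketch of the Promise.all approach (the module paths are made up):

// Each import() returns a Promise; Promise.all resolves once every
// requested module has come through the WebSocket queue.
Promise.all([
    import('./moduleA'),
    import('./moduleB'),
    import('./moduleC')
]).then(([a, b, c]) => {
    // All three modules are available here.
    a.setup();
});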

GruntJS and custom task: require a RequireJS module

I am quite new to GruntJS, and I wonder whether it is possible to have a task that loads some RequireJS modules, processes them, and writes the result to a JS file.
Here is my scenario:
I have a RequireJS-based project with many files.
I would like to concatenate/minify/etc. the project to deploy it and improve performance.
The optimization works perfectly with the grunt-contrib-requirejs plugin.
The grunt-contrib-requirejs plugin works from a main.js file, and I need to generate that file dynamically.
I would like to generate main.js by processing some RequireJS modules of the project (call them fileA.js and fileB.js).
I would then like to use the generated main.js to run the grunt-contrib-requirejs plugin.
So the task sequence would be something like:
Custom Task:
loads fileA.js and fileB.js
merges them together
writes the merged result to a new JS file
grunt-contrib-requirejs task:
uses the generated main.js file to optimize the project
Do you know how I can achieve this?
I don't have any kind of restrictions on the way/tools/libs to use.
You can load RequireJS in Grunt, as follows:
var requirejs = require('requirejs');
You can then fetch all the fileX.js files in your tree through Grunt:
grunt.file.recurse('js/modules/', function callback(abspath, rootdir, subdir, filename) {
    if (filename === 'fileX.js') {
        /* Do something with the module source here. */
    }
});
Once you have all the modules you need, you can use r.js to minify/concatenate them.
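Putting those pieces together, a hedged sketch of the whole sequence (fileA.js, fileB.js, and main.js come from the question; the paths, the merge step, and the r.js options are assumptions):

var requirejs = require('requirejs');

module.exports = function (grunt) {
    // Step 1: merge fileA.js and fileB.js into a generated main.js.
    grunt.registerTask('generate-main', function () {
        var merged = grunt.file.read('js/modules/fileA.js') + '\n' +
                     grunt.file.read('js/modules/fileB.js');
        grunt.file.write('js/main.js', merged);
    });

    // Step 2: run the r.js optimizer against the generated main.js.
    grunt.registerTask('optimize', function () {
        var done = this.async(); // r.js works asynchronously
        requirejs.optimize({
            baseUrl: 'js',
            name: 'main',
            out: 'dist/main.min.js'
        }, function () { done(); }, function (err) { done(err); });
    });

    grunt.registerTask('build', ['generate-main', 'optimize']);
};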

Building multiple outputs through the same build process with external config

I'm trying to leverage GruntJS to create a build process that is uniform across multiple teams and projects at my company. The idea here is that each application has a config file that specifies only the files to process and the bundles they should be concatenated into at the end. The build process would be the same for all apps: pick up the app's config, then run the files in each bundle through a uniform build process.
For example:
The asset.json config file specifies two bundles: "main" with 1.js + 2.js, and "secondary" with 2.js and 3.js.
The build process says: for each bundle, preprocess, minify, then concatenate into a JS file named after the bundle.
The output is main.js and secondary.js.
The problem I'm running into is that Grunt takes a "static" configuration and executes it. I've already abstracted out the building of the configuration so that I can add chunks dynamically, but right now I don't see a better way forward than literally looping over each bundle, building a unique task for each step of the build process for each bundle, queuing those tasks up, and then running each queue during the build. It's definitely possible, but it's a lot of manual work and seems prone to breaking. Is there a way to just execute each task in order as I loop over the bundles? Is there a better way to achieve the same net result of config + source in, N bundles out?
I want to be clear that I am fully aware that Grunt CAN build multiple files. What I'm trying to do is separate the specification of the bundles from the build steps themselves. Grunt core bakes these two things together, which means each project would have to go in and alter its build steps rather than just an external configuration. As in the example above, I should be able to swap out the asset.json file specified in step 1 for any config file that has 1, 2, 3, ... N bundles with N files in each one (potentially also specifying a "type" such as scripts or styles).
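To make that concrete, here is a sketch of the loop-driven approach described above, assuming grunt-contrib-concat and the asset.json from the example (the preprocess/minify steps are omitted):

module.exports = function (grunt) {
    var assets = grunt.file.readJSON('asset.json'); // { "main": [...], "secondary": [...] }
    var concatConfig = {};

    // One concat target per bundle; running the 'concat' task with no
    // target executes every target in order.
    Object.keys(assets).forEach(function (bundle) {
        concatConfig[bundle] = {
            src: assets[bundle],
            dest: 'build/' + bundle + '.js'
        };
    });

    grunt.initConfig({ concat: concatConfig });
    grunt.loadNpmTasks('grunt-contrib-concat');
    grunt.registerTask('default', ['concat']);
};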
Edit 10/12/13: The Nitty Gritty posted an article yesterday that might be another approach to tackling your issue.
This can be done by passing the name of the bundle you want to build as a command-line argument and loading the whole assets file in your Grunt config. Please note this is example code; I have not tested it, so you may need to adjust paths etc. for your case.
Start by converting the assets.json file to a plain JavaScript file (assets.js), reshaped like so:
module.exports = {
    main: ["1.js", "2.js"],
    secondary: ["2.js", "3.js"]
};
Next, you can pass a command-line argument to Grunt specifying one of the bundle names in assets.js. For example:
grunt --bundle=main
Now, you'll need to load in the assets.js file in the Gruntfile:
var assets = require('./assets'); // assuming assets.js is on the same level as your Gruntfile
And then you can get the argument name by using:
var bundle = grunt.option("bundle");
Now you can use bundle as your output file name and assets[bundle] to get the array of files for that bundle.
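A sketch of the complete Gruntfile for this approach (still untested example code; the concat task and the output path are assumptions):

var assets = require('./assets'); // the module map shown above

module.exports = function (grunt) {
    var bundle = grunt.option('bundle') || 'main';

    grunt.initConfig({
        concat: {
            dist: {
                src: assets[bundle],            // file list for the chosen bundle
                dest: 'build/' + bundle + '.js' // bundle name as output name
            }
        }
    });

    grunt.loadNpmTasks('grunt-contrib-concat');
    grunt.registerTask('default', ['concat']);
};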

RequireJS with multiple pages -- using optimizer

I have my JavaScript organized as described here: https://stackoverflow.com/a/10816983/83897.
I have a JavaScript-heavy ASP.NET application with multiple distinct pages (vs. a single-page application). Each page has different dependencies, so I have a per-page .js file (page1.js, page2.js, etc.). Each one contains a require() call declaring its dependencies:
require(['jquery', 'page1Module'], function ($, module) {
    // page1-specific stuff here
});
This works fine. What I'm wondering is: how might the RequireJS build process work? I think I want a per-page "build" .js file (e.g. page1-build.js, page2-build.js, etc.). Is there existing software I can leverage?
The process might look like this:
Compile all dependencies for a given script into one build.js file in a temporary directory.
Calculate an MD5 fingerprint for the compiled file (a sketch of this step follows the list).
Compare that fingerprint with the comparable file in public/assets.
Create an in-memory RequireJS manifest, mapping each module to the compiled file. Append this manifest to the compiled file.
Somehow make production use the build file.
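The fingerprinting step is straightforward with Node's built-in crypto module; a sketch (the function name is mine):

var crypto = require('crypto');
var fs = require('fs');

// MD5 fingerprint of a compiled file, for comparison against public/assets.
function fingerprint(filePath) {
    return crypto.createHash('md5')
                 .update(fs.readFileSync(filePath))
                 .digest('hex');
}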
EDIT: After some thought, I'm thinking the RequireJS optimization using Node + r.js will just be part of a larger asset-building process that relies on some other third-party library. The RequireJS optimization will simply be used for certain JavaScript dependencies (i.e. the JavaScript files for each page, including discovered dependencies), perhaps specified in some XML config.
You can create multiple optimized files by specifying the modules in the build profile:
{
    modules: [
        {
            name: "main"
        },
        {
            name: "page1",
            include: ["dep1", "shim2"],
            exclude: ["main"]
        }
    ]
}
Each entry will generate an optimized .js file.
More info here: How to use RequireJS build profile + r.js in a multi-page project
