I'm trying to use Grunt to clean up a large project. For this specific example, I am trying to run unit tests and want to do so only for paths under the current grunt execution directory (i.e., the result of pwd).
I want one Gruntfile at the project root. I know grunt will find and execute this with no problem from any subdirectory. If I define my test runner options to look in "test/", it only runs tests under {project root/}test/. Is there a way to tell a project-level Gruntfile to make its paths (in all or in part) relative to the executing location?
Notes:
I don't need to be told "Why would you do this? Grunt should manage your whole project!" This is a retrofit, and until that halcyon day when it all works, I want/need it piecemeal.
To reiterate, "**/test/" isn't the answer, because I want only the tests under the current grunt execution directory.
--base also won't work, because Grunt will look for the Node packages at the base location.
I have, for similar situations, used a shared configuration JSON file that I've imported with grunt.config.merge(grunt.file.readJSON("../grunt-shared.json"));. However, that requires Gruntfiles in subfolders, as well as a hard-coded path to the shared file (e.g., ../), which seems tenuous.
I could write code to do some directory climbing and path building, but I'd like to make that a last resort.
Here's the solution I came up with (H/T to #firstdoit, https://stackoverflow.com/a/28763634/356016):
Create a single, shared JavaScript file at the root of the project to centralize Grunt behavior.
Each "sub-project" directory has a minimal, boilerplate Gruntfile.js.
Manually adjust Grunt's file base in the shared file to load from one node_modules source.
Gruntfile.js
/**
* This Gruntfile is largely just to establish a file path base for this
* directory. In all but the rarest cases, it can simply allow Grunt to
* "pass-through" to the project-level Gruntfile.
*/
module.exports = function (grunt)
{
    var PATH_TO_ROOT = "../";

    // If customization is needed...
    // grunt.config.init({});
    // grunt.config.merge(require(PATH_TO_ROOT + "grunt-shared.js")(grunt));

    // Otherwise, just use the root...
    grunt.config.init(require(PATH_TO_ROOT + "grunt-shared.js")(grunt));
};
Using a var for PATH_TO_ROOT is largely unnecessary, but it provides a single focus point for using this boilerplate file across sub-projects.
{ROOT}/grunt-shared.js
module.exports = function (grunt)
{
    // load needed Node modules
    var path = require("path");

    var processBase = process.cwd();
    var rootBase = path.dirname(module.filename);

    /*
     * Normally, load-grunt-config also provides the functionality
     * of load-grunt-tasks. However, because of our "root modules"
     * setup, we need the task configurations to happen at a different
     * file base than task (module) loading. We could pass the base
     * for tasks to each task, but it is better to centralize it here.
     *
     * Set the base to the project root, load the modules/tasks, then
     * reset the base and process the configurations.
     *
     * WARNING: This is only compatible with the default base. An
     * explicit base will be lost.
     */
    grunt.file.setBase(rootBase);
    require("load-grunt-tasks")(grunt);

    // Restore file path base.
    grunt.file.setBase(processBase);

    // Read every config file in {rootBase}/grunt/ into Grunt's config.
    var configObj = require("load-grunt-config")(grunt, {
        configPath: path.join(rootBase, "grunt"),
        loadGruntTasks: false
    });

    return configObj;
};
My goal is to use the pyodide package in a Next.js project. Pyodide works like this:
import * as pyodideModule from "pyodide";
// ...
const pyodide = await pyodideModule.loadPyodide({
indexURL: "/pyodide-data",
});
The Pyodide client uses the indexURL to essentially do this:
fetch(indexURL + "/pyodide.asm.data")
fetch(indexURL + "/repodata.json")
fetch(indexURL + "/pyodide.asm.wasm")
fetch(indexURL + "/pyodide_py.tar")
fetch(indexURL + "/pyodide.asm.js")
These files are supplied in the pyodide npm package:
$ find node_modules/pyodide/py*
node_modules/pyodide/pyodide.asm.data
node_modules/pyodide/pyodide.asm.js
node_modules/pyodide/pyodide.asm.wasm
node_modules/pyodide/pyodide.d.ts
node_modules/pyodide/pyodide.js
node_modules/pyodide/pyodide.js.map
node_modules/pyodide/pyodide.mjs
node_modules/pyodide/pyodide.mjs.map
node_modules/pyodide/pyodide_py.tar
So I need to make those files part of my Next.js build.
I could do the dumb thing of manually copying the files into my public/ directory, but this means I need to ensure the versions are the same, and it pollutes my version control. I would much rather the build system do this for me.
I found this approach which uses copy-webpack-plugin to copy the files into my public/ directory, but this still pollutes public/, which is version-controlled and meant for project-specific files.
I've also tried copying to .next/static/pyodide-data and then telling Pyodide to load its files from the base URL "/_next/static/pyodide-data". This works, but feels very hacky - I'm using the private _next namespace.
What's the official, non-hacky Next.js way to make node_modules files available for dynamic fetching from the client?
When I run the command yarn build, files are created in the public/build directory, and each generated file gets a new filename containing a random hash string:
For the files themes/light and themes/dark only, I want to automatically remove the random hash string when I run yarn build and keep the original filenames. Currently the command generates these files:
public/build/themes/light.3ac94fb2.css
public/build/themes/dark.064ff2f6.css
And instead, I want to have:
public/build/themes/light.css
public/build/themes/dark.css
Is it possible to do that automatically?
If you don't mind having the files twice, one with the hash, one without, then the copyFiles plugin is probably the way to go:
This works by adding something like the following to your Encore configuration in webpack.config.js:
Encore
    // Your usual config comes here
    .copyFiles([
        {
            from: './assets/themes/light.css',
            // or wherever the file lives in the assets folder
            to: 'public/build/themes/light.css'
        },
        {
            from: './assets/themes/dark.css',
            // or wherever the file lives in the assets folder
            to: 'public/build/themes/dark.css'
        }
    ])
;
Notice: untested – find more explanation here: https://symfonycasts.com/screencast/webpack-encore/copy-files
There is also a configureFilenames function, but I am not sure you'll be able to fit conditionals in it.
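For reference, a configureFilenames call looks something like the sketch below (untested, like the rest of this answer); it changes the pattern for every file of a given type, which is why a conditional for just the two theme files is hard to express:

```javascript
Encore
    // removes the hash from ALL generated CSS, not only the two themes
    .configureFilenames({
        css: '[name].css'
    })
;
```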
I'm trying to create an app based on Jetty 9.4.20 (embedded) and Vaadin Flow 14.0.12.
It is based on the very nice project vaadin14-embedded-jetty.
I want to package the app as one main JAR, with all dependency libs in a 'libs' folder next to it.
I removed maven-assembly-plugin and instead use maven-dependency-plugin and maven-jar-plugin. In maven-dependency-plugin I added an <execution>get-dependencies</execution> section where I unpack the directories META-INF/resources/ and META-INF/services/ from the Vaadin Flow libs into the resulting JAR.
In this case the app works fine. But if I comment out the <execution>get-dependencies</execution> section, the resulting package doesn't contain those directories and the app doesn't work.
It just cannot serve some static files from the Vaadin Flow libs.
This error occurs only if I launch the packaged app with ...
$ java -jar vaadin14-embedded-jetty-1.0-SNAPSHOT.jar
... but from IntelliJ IDEA it launches correctly.
There was an opinion that Jetty is starting with the wrong ClassLoader and cannot serve requests for static files in JAR libs.
The META-INF/services/ files from the Jetty libs MUST be kept intact.
That's important for Jetty to use java.util.ServiceLoader.
If you are merging the contents of JAR files into a single JAR file, that's called an "uber jar".
There are many techniques to do this, but if you are using maven-assembly-plugin or maven-dependency-plugin to build this "uber jar", then you will not be merging critical files that have the same name across multiple JAR files.
Consider using maven-shade-plugin and its associated Resource Transformers to properly merge these files.
http://maven.apache.org/plugins/maven-shade-plugin/
http://maven.apache.org/plugins/maven-shade-plugin/examples/resource-transformers.html
The ServicesResourceTransformer is the one that merges META-INF/services/ files; use it.
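In the plugin configuration, that transformer looks roughly like this (a sketch; the phase and goal shown are the usual defaults for shading):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- concatenates META-INF/services/* entries with the same name -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```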
As for static content, that works fine, but you have to set up your Base Resource properly.
Looking at your source, you do the following ...
final URI webRootUri = ManualJetty.class.getResource("/webapp/").toURI();
final WebAppContext context = new WebAppContext();
context.setBaseResource(Resource.newResource(webRootUri));
That won't work reliably in 100% of cases (as you have noticed when running in the IDE vs command line).
The Class.getResource(String) is only reliable if you lookup a file (not a directory).
Consider that the Jetty Project Embedded Cookbook recipes have techniques for this.
See:
WebAppContextFromClasspath.java
ResourceHandlerFromClasspath.java
DefaultServletFileServer.java
DefaultServletMultipleBases.java
XmlEnhancedServer.java
MultipartMimeUploadExample.java
Example:
// Figure out what path to serve content from
ClassLoader cl = ManualJetty.class.getClassLoader();

// We look for a file, as ClassLoader.getResource() is not
// designed to look for directories (we resolve the directory later)
URL f = cl.getResource("webapp/index.html");
if (f == null)
{
    throw new RuntimeException("Unable to find resource directory");
}

// Resolve file to directory
URI webRootUri = f.toURI().resolve("./").normalize();
System.err.println("WebRoot is " + webRootUri);

WebAppContext context = new WebAppContext();
context.setBaseResource(Resource.newResource(webRootUri));
I'm trying to set an environment variable for an API key that I don't want in my code. My source JavaScript looks something like this:
.get(`http://api-url-and-parameters&api-key=${process.env.API_KEY}`)
I'm using webpack and the package dotenv-webpack (https://www.npmjs.com/package/dotenv-webpack) to set API_KEY in a gitignored .env file, and it's all running fine on my local machine. I'd also like to be able to set that variable when deploying through Netlify. I've tried adding it through the GUI to the 'build environment variables', and also setting it directly in the build command, but without success.
Any idea what might be the issue ?
WARNING: If this is a secret key, you will not want to expose this environment variable value in any bundle that gets returned to the client. It should only be used by your build scripts to create your content during the build.
Issue
dotenv-webpack expects there to be a .env file to load your variables from during the webpack build of your bundle. When the repository is checked out by Netlify, the .env does not exist because, for good reason, it is in .gitignore.
Solution
Store your API_KEY in the Netlify build environment variables and build the .env using a script prior to running the build command.
scripts/create-env.js
const fs = require('fs')
fs.writeFileSync('./.env', `API_KEY=${process.env.API_KEY}\n`)
Run the script as part of your build
node ./scripts/create-env.js && <your_existing_webpack_build_command>
Caveats & Recommendations
Do not use this method with a public-facing [open] repository, because any PR or branch deploy could add a simple script to your code that exposes the API_KEY.
The example script above is kept simple, so make sure any script you use can exit with a code other than 0, so that if the script fails, the deploy fails too.
You can set Dotenv-webpack to load system environment variables as well as those you have declared in your .env file by doing the following:
plugins: [
    new Dotenv({
        systemvars: true
    })
]
I.e., setting the systemvars attribute of your webpack dotenv plugin to true.
Note that system environment variables with the same name will overwrite those defined in your .env file.
Source: https://www.npmjs.com/package/dotenv-webpack#properties
If you go to the corresponding site's settings in Netlify, under Build & deploy you will find a section called Environment variables, where you can easily add your environment variables. If you add a MY_API_KEY variable to the environment variables, you will be able to access it inside your project via process.env.MY_API_KEY.
If you're using Nuxt.js there is a more "straightforward" approach.
Just edit the nuxt.config.js like so:
module.exports = {
    env: {
        GOOGLE_API_KEY: process.env.GOOGLE_API_KEY
    },
    // ...
}
Then add the GOOGLE_API_KEY to Netlify through the build environment variables as usual.
Credit goes to yann-linn and his answer on github.
What you can also do is define a global constant in webpack. Netlify environment variables defined in the UI will work with it. You don't need dotenv or dotenv-webpack.
webpack.config.js
const webpack = require("webpack");

module.exports = {
    plugins: [
        new webpack.DefinePlugin({
            "process.env.API_KEY": JSON.stringify(process.env.API_KEY)
        }),
    ]
}
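DefinePlugin performs a compile-time textual substitution, which the stand-in below illustrates (this is not webpack's implementation, just the idea; the sample source string echoes the question's fetch call):

```javascript
// What the bundler conceptually does to your source at build time:
const source = 'fetch(`http://api-url-and-parameters&api-key=${process.env.API_KEY}`)';
const apiKey = 'abc123'; // would come from the Netlify build environment

// JSON.stringify adds the quotes, so the replacement is a valid string literal
const bundled = source.replace(/process\.env\.API_KEY/g, JSON.stringify(apiKey));
console.log(bundled);
// fetch(`http://api-url-and-parameters&api-key=${"abc123"}`)
```

This substitution is exactly why the key ends up visible in the shipped bundle, as the caveat below explains.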
However, again, you shouldn't do this by simply injecting environment variables into the frontend if your API key is confidential and the project is public. The API key will appear in the source code of the website and will be easily accessible to everyone visiting it. A Lambda function would be a better option.
You can also use Netlify's config file ...
You can find documentation here.
I also wanted to have the same ENV variables with different values per branch/environment.
This workaround worked for me:
Create a netlify.toml file like:
[build.environment]
NUXT_ENV_BASE_API = "/api"
NUXT_ENV_HOST_DOMAIN = "https://your-domain.gr"

[context.branch-deploy]
environment = { NUXT_ENV_BASE_API = "/dev-api", NUXT_ENV_HOST_DOMAIN = "https://dev.your-domain.gr" }

[context.production]
environment = { NUXT_ENV_BASE_API = "/api", NUXT_ENV_HOST_DOMAIN = "https://your-domain.gr" }
And deploy in Netlify ...
I've installed Less via npm like this
$ npm install -g less
Now every time that I want to compile source files to .css, I run
$ lessc styles.less styles.css
Is there any way, via the command line, to make it listen for when I save the document and compile it automatically?
The best solution I've found is the one recommended on the official LESS website: https://github.com/jgonera/autoless. It is dead simple to use, and it also listens for changes in imported files and recompiles.
Have a look at this article:
http://www.hongkiat.com/blog/less-auto-compile/
It offers GUI solutions (SimpLESS, WinLess, LESS.app, and Crunch) and a Node solution (deadsimple-less-watch-compiler).
Are you using Less alone or with Node.js? Because if you are using it with Node, there are easy ways to solve this problem. The first two I can think of are (both of these solutions go in your app.js):
using a middleware, as described in this Stack Overflow discussion
var lessMiddleware = require('less-middleware');
...
app.configure(function(){
    // other configuration here...
    app.use(lessMiddleware({
        src : __dirname + "/public",
        compress : true
    }));
    app.use(express.static(__dirname + '/public'));
});
another method consists of making a system call as soon as you start your Node.js instance (the method name may differ based on your Node.js version)
// before all the treatment is done
var execSync = require("child_process").execSync;
execSync("lessc /public/stylesheets/styles.less /public/stylesheets/styles.css");

var app = express();
app.use(...);
In both cases, Node will automatically convert the Less files into CSS files. Note that with the second option, Node has to be relaunched for the conversion to happen, whereas the first option answers your need better by always checking for a newer version in a given directory.