Use a WebAssembly module compiled with Emscripten in Next.js

I am trying to build a Next.js project with an imported WebAssembly module compiled using Emscripten.
The problem seems to be related to the webpack loader being unable to import the .wasm module via the generated .js.
I would greatly appreciate some help.
My C++ file hello.cpp:
#include <math.h>

extern "C" {
  int int_sqrt(int x) {
    return sqrt(x);
  }
}
I compile using:
em++ hello.cpp -o "../v2/lib/wasm/test.js" -s MODULARIZE -s WASM=1 -s EXPORT_NAME="SZU" -s ENVIRONMENT="web" -s EXPORTED_FUNCTIONS=_int_sqrt -s EXPORTED_RUNTIME_METHODS=ccall,cwrap
I try to use the module in one of the components like so:
import SZU from "../../../../../../wasm/test";
var int_sqrt = SZU().cwrap("int_sqrt", 'number', ['number'])
When I run npx next build the build fails with:
Error: not compiled for this environment (did you build to HTML and try to run it not on the web, or set ENVIRONMENT to something - like node - and run it someplace else - like on the web?)
I expect to be able to build the project using npx next build, then start the server with npx next start, and be able to interact with the functions exported in the .wasm/.js module files generated by Emscripten.

With the help of @lab I realized my approach was wrong. This is the way I made my WASM module work:
This is my hello.cpp file. Important: note the extern "C" {} block, which prevents C++ name mangling so the function can be resolved by name:
#include <math.h>

extern "C" {
  int int_sqrt(int x) {
    return sqrt(x);
  }
}
I compile using em++; the destination is the public folder in Next.js, because we need to serve the .wasm and .js files statically, as @lab pointed out:
em++ hello.cpp -o "../v2/public/wasm/test.js" -s MODULARIZE -s WASM=1 -s EXPORT_NAME="SZU" -s ENVIRONMENT="web" -s EXPORTED_FUNCTIONS=_int_sqrt -s EXPORTED_RUNTIME_METHODS=ccall,cwrap
I load the Emscripten-generated js file using the next/script component like so:
<Script src="/wasm/test.js" strategy='beforeInteractive'/>
In the component where I want to use my WASM module, I load it inside a useEffect hook, wrap it with Emscripten's cwrap, and store the function in a ref created with useRef, like so:
useEffect(() => {
  if (typeof (window as any).SZU === 'function' && WASM.current === null) {
    console.log("loading WASM!");
    (window as any).SZU()
      .then((wasm: WASM) => {
        console.log("got WASM!");
        WASM.current = wasm;
        int_sqrt.current = WASM.current.cwrap("int_sqrt", "number", ["number"]);
      });
  }
}, []);
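For completeness, the refs and the WASM type used above could be declared like this (a sketch; the type shape is my assumption about the Emscripten module, not part of the original answer):

// A sketch of the declarations the snippet above relies on.
type WASM = { cwrap: (name: string, returnType: string, argTypes: string[]) => (...args: any[]) => any };
const WASM = useRef<WASM | null>(null);  // holds the initialized Emscripten module
const int_sqrt = useRef<any>(null);      // holds the cwrap'ed int_sqrt function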
I can call the function later like so:
console.log(`Square root of 16 is ${int_sqrt.current(16)}`);
I hope this helps someone who had the same problem.

The thing with wasm is that it compiles into two files: a .js file and a .wasm file. The .wasm file cannot be bundled into your app. You must serve it as a static asset (since it's basically binary code) and configure the path to align with your desired import.
Also, you might want to ensure it works first. Try writing a Node.js script that imports it and gives it a test run, or import it into a Next.js API route and invoke it via fetch from the client side to see whether your compiled wasm works.
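For example, a minimal Node.js smoke test might look like this (a sketch; the file name is hypothetical, and it assumes the module was recompiled with -s ENVIRONMENT=node or web,node, since the web-only build refuses to run under Node):

// test-wasm.js (hypothetical name)
// Assumes hello.cpp was recompiled with -s ENVIRONMENT=node (or web,node).
const SZU = require('./test.js'); // the Emscripten-generated glue file
SZU().then((wasm) => {
  const int_sqrt = wasm.cwrap('int_sqrt', 'number', ['number']);
  console.log(int_sqrt(16)); // expected output: 4
});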

Related

Deno: How to allow read permission for directory/folder

Looking at https://deno.land/manual/examples/file_system_events, which gives the code below:
const watcher = Deno.watchFs("./src");
for await (const event of watcher) {
console.log(">>>> event", event);
// { kind: "create", paths: [ "/foo.txt" ] }
}
However, when I try to pass the --allow-read permission I get the error "Is a directory":
deno run --allow-read src/ main.ts
error: Is a directory (os error 21)
How do I ensure that the explicit permission --allow-read is permitted for the specified /src folder?
I know that I can use -A to --allow-all, however, I want to be explicit to the allowed permission.
I guess I found the problem. Well first of all you need to use a = to add the allowed paths, like so:
deno run --allow-read=src/ main.ts
But it still won't work, and it seems to be a bug/enhancement.
In your script you need to provide the absolute path; then it will take effect:
const watcher = Deno.watchFs("<ABSOLUTE_PATH>/src");
For me it's an issue with the Deno.watchFs() method, and I opened one on GitHub:
https://github.com/denoland/deno/issues/5742

Building library with imports from another library using NX Monorepo

Here is the case. I am using Nrwl NX Monorepo. I have 2 libraries, lib-a and lib-b, both publishable libraries created via NX. Now I create a MyClass.ts in lib-a. Naturally, under paths in workspace/tsconfig.json, NX creates an alias to this lib-a ("@workspace/lib-a": ["libs/lib-a/src/index.ts"]). So far so good.
Now we can use this class anywhere within the workspace/monorepo by importing it: import { MyClass } from '@workspace/lib-a';
Unfortunately we cannot build lib-b, which imports MyClass. When we try to do it we get the error below. So the question is: how can we build lib-b?
PS: It seems strange that NX monorepo doesn't support such a common scenario, linking 2 publishable libs.
error TS6059: File 'd:/workspace/libs/lib-a/src/index.ts' is not under 'rootDir' 'd:\workspace\libs\lib-b\src'. 'rootDir' is expected to contain all source files.
Try adding
"paths": { "#workspace/*": ["dist/libs/*"] }
into your tsconfig.lib.json files. This should resolve the problem.
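For context, the "paths" entry belongs under compilerOptions; a minimal tsconfig.lib.json sketch (the surrounding fields are placeholders, your generated file will differ):

{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "outDir": "../../dist/out-tsc",
    "paths": { "@workspace/*": ["dist/libs/*"] }
  }
}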
Try this solution. Not sure it's official, but in my case it's working well.
Three problems need to be resolved:
TypeScript paths
Compiled JS paths
Working directory
First: TypeScript paths are resolved by adding "paths" to workspace/tsconfig.lib.json. NX does this automatically during library generation. See the answer from Radovan Skendzic.
Second: the problem with compiled JS paths is well described in "Typescript paths not working in an Express project". You need to install tsconfig-paths in your workspace:
yarn add -D tsconfig-paths
Third: since nx run [project]:[target] runs in the workspace/ directory, you should set cwd to the libs/lib-b home directory so it finds the correct tsconfig.json.
So, finally, you have the following executor (add this to your lib-b/project.json) that should work:
"targets": {
"start-dev": {
"executor": "#nrwl/workspace:run-commands",
"options": {
"commands": [
"nodemon -e ts,js --exec ts-node -r tsconfig-paths/register src/index.ts"
],
"cwd": "libs/lib-b"
}
},
...
}
Command to run:
nx run lib-b:start-dev
Don't override "baseUrl" and "paths" in any child tsconfig!
Put all of your "paths" in tsconfig.base.json!
Try adding lib-a as an implicit dependency of lib-b: add the line below to the libs/lib-b/project.json file and see what happens:
"implicitDependencies": ["lib-a"]
Running nx graph should show a dependency graph with lib-a as a dependency of lib-b (the names of the libraries don't matter).
After that you should be able to build both libraries; I hope it works for you as well.

How to call a Meteor method from the command line

I have a Meteor app and want to call a server method from the command line, so that I can write a bash script to perform scheduled operations.
Is there any way to either call a method directly, or submit a form which will then trigger server-side code?
I've tried using curl to call a method, but either it's not possible or I'm missing something basic. This doesn't work:
curl "http://localhost:3000/Meteor.call('myMethod')"
nor does:
curl -s -d "http://localhost:3000/imports/api/test.js" > out.html
where test.js:
var test = function () {
  console.log('hello');
};
I thought of using a form but I can't think how to create a submit event because the Meteor client uses template events that then call server methods.
I'll be very grateful for any help! This feels like it should be a simple thing but has me stumped.
Edit: I've also tried phantomjs and slimerjs as run through casperjs.
phantomjs is no longer maintained and generates an error:
TypeError: Attempting to change the setter of an unconfigurable property.
https://github.com/casperjs/casperjs/issues/1935
slimerjs errors with Firefox 60 and I can't figure out how to 'downgrade' back to the supported 59, and the option to disable automatic updates of Firefox no longer seems to exist. The error is:
c is undefined
https://github.com/laurentj/slimerjs/issues/694
You could make use of the Node ddp package to call the Meteor method from a standalone js file that you create for this purpose. From there you can pipe all output to wherever you want.
Let's assume the following Meteor method:
Meteor.methods({
  'myMethod'() {
    console.log("hello console")
    return "hello result"
  }
})
The upcoming steps will let you call this method from another shell, assuming your Meteor application is running.
1. Install ddp in your global npm directory
$ meteor npm install -g ddp
2. Create the script to call your method in your test directory
$ mkdir -p ddptest
$ cd ddptest
$ touch ddptest.js
Place the ddp script code into the file with the editor or command of your choice.
(The following code is freely taken from the package's readme. Feel free to configure it to your needs.)
ddptest/ddptest.js
var DDPClient = require(process.env.DDP_PATH);

var ddpclient = new DDPClient({
  // All properties optional, defaults shown
  host: "localhost",
  port: 3000,
  ssl: false,
  autoReconnect: true,
  autoReconnectTimer: 500,
  maintainCollections: true,
  ddpVersion: '1',  // ['1', 'pre2', 'pre1'] available
  // uses the SockJS protocol to create the connection;
  // this still uses websockets, but allows you to get the benefits
  // of projects like meteorhacks:cluster
  // (for load balancing and service discovery)
  // do not use the `path` option when you are using useSockJs
  useSockJs: true
  // Alternatively, set a full `url` instead of `host`, `port` and `ssl`;
  // do not set the `useSockJs` option if `url` is used:
  // url: 'wss://example.com/websocket'
});
ddpclient.connect(function (error, wasReconnect) {
  // If autoReconnect is true, this callback will be invoked each time
  // a server connection is re-established
  if (error) {
    console.log('DDP connection error!');
    console.error(error);
    return;
  }
  if (wasReconnect) {
    console.log('Reestablishment of a connection.');
  }
  console.log('connected!');
  setTimeout(function () {
    /*
     * Call a Meteor Method
     */
    ddpclient.call(
      'myMethod',               // name of the Meteor Method being called
      ['foo', 'bar'],           // parameters to send to the Meteor Method
      function (err, result) {  // callback which returns the method call results
        console.log('called function, result: ' + result);
        ddpclient.close();
      },
      function () {             // callback which fires when the server has finished
        console.log('updated');                   // sending any updated documents as a result of
        console.log(ddpclient.collections.posts); // calling this method
      }
    );
  }, 3000);
});
The code assumes that your app runs on localhost:3000; note that there is no connection close on errors or undesired behavior.
As you can see at the top, the file imports your globally installed ddp package. Now, in order to get its path without using additional tools, just pass an environment variable (process.env.DDP_PATH) and let your shell handle the path resolution.
In order to get the installation path you can use npm root with the global flag.
Finally call your script via:
$ DDP_PATH=$(meteor npm root -g)/ddp meteor node ddptest.js
Which will give you the following output:
connected!
updated
undefined
called function, result: hello result
And logs hello console to the open session that is running your meteor app.
Edit: A note on using this in production
If you want to use this script in production, you have to run the shell commands without the meteor prefix, using your own installation of node and npm.
If you get in trouble with paths use process.execPath to find your node binary and npm root -g to find your global npm modules.
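For instance, the production invocation might then look like this (a sketch; the exact paths depend on your server setup):

$ DDP_PATH=$(npm root -g)/ddp node ddptest.js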
You can check out this documentation: Command Line | meteor shell.
While your meteor app is running, you can execute meteor shell to start an interactive console. In the console, you can do Meteor.call(...).
So if you want to write a script using meteor shell, you might need to pipe the script file into meteor shell, like:
$ meteor shell < script_file
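For example, script_file could contain a plain method call (a sketch reusing myMethod from above):

Meteor.call('myMethod', function (err, result) {
  console.log(err, result);
});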
See also the answer to "How can I pipe a command into the meteor shell?"

Is there any way to tell angular-cli (for angular 2) to generate minified version of css?

As the title says, when I run "ng serve" angular-cli generates normal css whereas I expect to get the minified version.
Is there any specific setting to use for angular-cli-build, or some additional plugin to install and use?
This is my angular-cli-build.js
var Angular2App = require('angular-cli/lib/broccoli/angular2-app');

module.exports = function (defaults) {
  return new Angular2App(defaults, {
    vendorNpmFiles: [
      'systemjs/dist/system-polyfills.js',
      'systemjs/dist/system.src.js',
      'zone.js/dist/**/*.+(js|js.map)',
      'es6-shim/es6-shim.js',
      'reflect-metadata/**/*.+(ts|js|js.map)',
      'rxjs/**/*.+(js|js.map)',
      '@angular/**/*.+(js|js.map)',
      'angular2-cookie/**/*.js'
    ]
  });
};
ng build --prod --env=prod
or
ng serve --prod
Will minify and add a file hash for you.
The --prod flag tells it to minify, hash, and gzip.
The --env=prod flag tells it to use your prod environment constants file, which might look like this:
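A typical src/environments/environment.prod.ts is just a constants object (a sketch; your constants may differ):

export const environment = {
  production: true
};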
You can use
# --env=<your_env>
# --no-sourcemap
# minify => ./minify.js
ng build --env=prod --no-sourcemap && node minify
minify.js
// npm i --save-dev minifier fs-jetpack
const jetpack = require('fs-jetpack');
const path = require('path');
const minifier = require('minifier');

const files = jetpack.list(path.join(__dirname, 'dist'));
console.log(files);

for (const file of files) {
  // no `g` flag here: a global regex keeps its lastIndex between
  // test() calls and would silently skip matching files
  if (/.*(\.js|\.css)$/.test(file)) {
    console.log(`Start ${file}`);
    const filePath = path.join(__dirname, 'dist', file);
    minifier.minify(filePath, { output: filePath });
  }
}
console.log('End');
console.log('End');
James' commands DO work and DO minify, even when using ng serve --prod.
However, you may look at the served source in Chrome and get confused: it doesn't look minified at all!
Look more carefully and you'll see js:formatted, indicating that the pretty-print feature was enabled.
Opening the URL http://localhost:4200/main.5082a3da36a8d45dfa42.js directly in a new tab showed me that the CLI was indeed minifying it fully.
You can click the {} icon to turn this feature off, but the icon tends to disappear once the code has been pretty-printed, so you may need to reload the page and click it quickly.
In 2020 it is enough to use the --prod flag when building the project:
ng build --prod

Deploy Qt plugin with its own DLL dependencies

I have a Qt application app.exe and a Qt plugin plugin.dll. My plugin.dll depends on many other dynamic libraries (e.g. lib1.dll, lib2.dll, and so on). To distribute my project I have this folder structure (ignoring Qt libraries):
app.exe
plugins\
    plugin.dll
lib1.dll
lib2.dll
lib3.dll
The problem is that there are too many libX.dll dependencies and I want to hide them inside the plugin folder, e.g.:
app.exe
plugin\
    plugin.dll
    lib1.dll
    lib2.dll
    lib3.dll
But this way the libX.dll libraries are "unseen" by my plugin, so it cannot be loaded. Is there any way to solve this problem?
I am using this code to import libX.dll in plugin.dll's .pro file:
LIBS += -Lpath -l lib1 -l lib2 -l lib3
One way of solving this problem is to:
Link all libraries dynamically (at runtime)
Add an extra location to search for the libraries
These changes should be done in the plugin.dll code:
// Includes needed for this snippet (windows.h provides SetDllDirectoryW)
#include <windows.h>
#include <QCoreApplication>
#include <QDir>
#include <QLibrary>

/* Declare a pointer to the imported function */
typedef void (*FUNCTION)();
FUNCTION f;

/* Make the system search for the DLLs in my plugin folder */
// Variable "app" contains the directory of the application, not the plugin
QDir app = QDir(qApp->applicationDirPath());
// Combine the path
QString plugin_path = app.filePath("plugins/");
// Add the full path to the DLL search path
// (the explicit W version matches the wide string we pass)
SetDllDirectoryW(plugin_path.toStdWString().c_str());

/* Linking the library */
QLibrary mylib("mylib.dll");
f = (FUNCTION) mylib.resolve("function");
if (f != NULL)
    f();    // You got the function from the DLL
else
    return; // DLL could not be loaded
This solution has disadvantages:
It is not platform independent (I think you can avoid using SetDllDirectory on UNIX-like systems, but I am not sure)
If you import a lot of functions you will have a lot of pointers
Does anyone know a pure Qt solution?
