Getting TestCafe to recognize dotenv variables - automated-tests

I might be mixing up concepts, but I'd read that it's possible to get TestCafe to recognize variables of the form process.env.MY_COOL_VARIABLE. Also, for my Vue.js frontend (built using Vue CLI, which uses dotenv under the hood), I found I could create a .env.test file for test values like so:
VUE_APP_MY_COOL_VARIABLE
which I would then access in my test code like so:
test('my fixture', async (t) => {
    ...
    await t
        .click(mySelector.find('.div').withText(process.env.VUE_APP_MY_COOL_VARIABLE));
    ...
});
However, I get the following error:
"text" argument is expected to be a string or a regular expression, but it was undefined.
Seems like my environment variables aren't getting picked up. I build my code like so: vue-cli-service build --mode test.

TestCafe doesn't provide support for .env files out of the box. You can create a test file that will require the dotenv module and load your configuration file:
// enable-dotenv.test.js
require('dotenv').config({ path: '.my.env' });
testcafe chrome enable-dotenv.test.js tests/
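As a minimal sketch (assuming .my.env defines VUE_APP_MY_COOL_VARIABLE and the app is already running locally; the URL and selector here are placeholders), a test in tests/ can then read the variable because enable-dotenv.test.js ran first:
// tests/example.test.js
import { Selector } from 'testcafe';

fixture('my fixture')
    .page('http://localhost:8080'); // placeholder URL for the app under test

test('finds the element by env-driven text', async (t) => {
    // process.env is already populated because enable-dotenv.test.js ran first
    await t
        .expect(Selector('.div').withText(process.env.VUE_APP_MY_COOL_VARIABLE).exists)
        .ok();
});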

Here's how I solved my issue. When debugging, I did a console.log of process.env and noticed that the variable Vue recognizes wasn't visible during TestCafe's run. From our package.json:
"test:ui:run": "VUE_APP_MY_COOL_VARIABLE=ui-test yarn build:test && testcafe -a ../ui-test-server.sh chrome",
Also, this bit of JavaScript is run by both the test and mainline code, so I had to use a conditional.
import * as dotenv from 'dotenv';

if (process.env.npm_package_scripts_test_ui_run) { // are we running a testcafe script?
    dotenv.config({ path: '.env.test' });
}

Have you tried process.env['VUE_APP_MY_COOL_VARIABLE']? It's worth noting that everything in dotenv comes back as a string, so you may need to do the casting yourself. For example:
function getEnvVariableValue(envVariable: string) {
    // Read the raw string value; dotenv always yields strings
    const envVariableValue = process.env[envVariable] ?? "";
    // Cast to boolean
    if (envVariableValue.toUpperCase() === "TRUE") {
        return true;
    } else if (envVariableValue.toUpperCase() === "FALSE") {
        return false;
    // Cast to number
    } else if (envVariableValue !== "" && !isNaN(Number(envVariableValue))) {
        return Number(envVariableValue);
    } else {
        return envVariableValue;
    }
}
You can also try creating a .env file in the root folder to see if it picks it up that way. I use dotenv in my project directly by including it in the package.json as a dependency, and it works this way.
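For example (a minimal sketch; the file contents and value are placeholders), with dotenv as a direct dependency you can load a root-level .env explicitly:
// .env in the project root would contain, e.g.: VUE_APP_MY_COOL_VARIABLE=ui-test
// At the top of a test or setup file:
require('dotenv').config(); // without a path, dotenv looks for .env in the current working directory
console.log(process.env.VUE_APP_MY_COOL_VARIABLE); // should print the value from .env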

Related

Nextjs with Jest: No tests found, exiting with code 1. Run with `--passWithNoTests` to exit with code 0

import nextJest from 'next/jest'

const createJestConfig = nextJest({
    dir: './',
})

// Add any custom config to be passed to Jest
const customJestConfig = {
    setupFilesAfterEnv: ['<rootDir>/setupTests.js'],
    moduleDirectories: ['node_modules', '<rootDir>/'],
    testEnvironment: 'jest-environment-jsdom',
    modulePathIgnorePatterns: ['./cypress'],
    // testMatch: ['<rootDir>/**/*.test.js', '<rootDir>/**/*.test.jsx'],
}

module.exports = createJestConfig(customJestConfig)
In my project, we use a Next.js application with both Cypress and Jest. The currently recommended jest.config.ts is shown above.
If you are facing this problem, try checking your modulePathIgnorePatterns.
I added ./ to ['cypress'], making it ['./cypress'], and then it worked. So I think Jest simply couldn't recognize the path before.
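To spell out the change (same key as in the config above):
// before: Jest reported "No tests found"
// modulePathIgnorePatterns: ['cypress'],
// after: with the leading ./ the tests were discovered again
modulePathIgnorePatterns: ['./cypress'],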

SQL with Prisma under Electron

My main goal is to create an Electron app (Windows) that locally stores data in an SQLite database. Because of type safety, I chose to use the Prisma framework instead of other SQLite frameworks.
I took this Electron sample project and now try to include Prisma. Depending on what I try, different problems arise.
1. PrismaClient is unable to be run in the Browser
I executed npx prisma generate and then try to execute this function via a button:
import { PrismaClient } from '@prisma/client';

onSqlTestAction(): void {
    const prisma = new PrismaClient();
    const newTestObject = prisma.testTable.create({
        data: {
            value: "TestValue"
        }
    });
}
When executing this in Electron I get this:
core.js:6456 ERROR Error: PrismaClient is unable to be run in the browser.
In case this error is unexpected for you, please report it in https://github.com/prisma/prisma/issues
at new PrismaClient (index-browser.js:93)
at HomeComponent.onSqlTestAction (home.component.ts:19)
at HomeComponent_Template_button_click_7_listener (template.html:7)
at executeListenerWithErrorHandling (core.js:15281)
at wrapListenerIn_markDirtyAndPreventDefault (core.js:15319)
at HTMLButtonElement.<anonymous> (platform-browser.js:568)
at ZoneDelegate.invokeTask (zone.js:406)
at Object.onInvokeTask (core.js:28666)
at ZoneDelegate.invokeTask (zone.js:405)
at Zone.runTask (zone.js:178)
It somehow seems logical that Prisma cannot run in a browser. But I am actually building a native app; Electron merely embeds a browser, so this feels like a contradiction.
2. BREAKING CHANGE: webpack < 5 used to include polyfills
So I found this question: How to use Prisma with Electron
It seemed to be exactly what I was looking for, but the error message there is different (Debian binaries were not found).
The solution provided there is to generate the Prisma artifacts into the src folder instead of node_modules - and this leads to 19 polyfill errors. One example:
./src/database/generated/index.js:20:11-26 - Error: Module not found: Error: Can't resolve 'path' in '[PATH_TO_MY_PROJECT]\src\database\generated'
BREAKING CHANGE: webpack < 5 used to include polyfills for node.js core modules by default.
This is no longer the case. Verify if you need this module and configure a polyfill for it.
If you want to include a polyfill, you need to:
- add a fallback 'resolve.fallback: { "path": require.resolve("path-browserify") }'
- install 'path-browserify'
If you don't want to include a polyfill, you can use an empty module like this:
resolve.fallback: { "path": false }
And this repeats for 18 other modules. Since the original error message was different, I also doubt that this is the way to go.
I finally figured this out. What I needed to understand was that every Electron app consists of two parts: the frontend web app (running in the embedded Chromium) and a Node backend. These are the renderer process and the main process, and they can communicate with each other over IPC (ipcRenderer and ipcMain). Since Prisma can only run in the main process, i.e. the backend, I had to send my SQL actions to the Electron backend and execute them there.
My minimal example
In the frontend (I use Angular)
// This refers to the node_modules folder of the Electron backend, the folder where the main.ts file is located.
// I just use this import so that I can use the Prisma-generated classes for type safety.
import { TestTable } from '../../../app/node_modules/.prisma/client';

// Button action
onSqlTestAction(): void {
    this.electronService.ipcRenderer.invoke("prisma-channel", 'Test input').then((value) => {
        const testObject: TestTable = JSON.parse(value);
        console.log(testObject);
    });
}
The sample project I used already had this service to provide the IPC Renderer:
@Injectable({
    providedIn: 'root'
})
export class ElectronService {
    ipcRenderer: typeof ipcRenderer;
    webFrame: typeof webFrame;
    remote: typeof remote;
    childProcess: typeof childProcess;
    fs: typeof fs;

    get isElectron(): boolean {
        return !!(window && window.process && window.process.type);
    }

    constructor() {
        // Conditional imports
        if (this.isElectron) {
            this.ipcRenderer = window.require('electron').ipcRenderer;
            this.webFrame = window.require('electron').webFrame;
            this.childProcess = window.require('child_process');
            this.fs = window.require('fs');
            // If you want to use NodeJS 3rd party deps in the renderer process (like @electron/remote),
            // they must be declared in the dependencies of both package.json files (in the root and app folders).
            // If you want to use the remote object in the renderer process, set enableRemoteModule to true in main.ts.
            this.remote = window.require('@electron/remote');
        }
    }
}
And then in the Electron backend I first added "@prisma/client": "^3.0.1" to the package.json (of the Electron backend, not the frontend). Then I added this function to main.ts to handle the requests from the renderer:
// main.ts
ipcMain.handle("prisma-channel", async (event, args) => {
    const prisma = new PrismaClient();
    await prisma.testTable.create({
        data: {
            value: args
        }
    });
    const readValue = await prisma.testTable.findMany();
    return JSON.stringify(readValue);
})
Simply adding the IPC main handler in the main.ts file like this is of course a big code smell, but it is useful as a minimal example. I think I will move on with the architecture concept presented in this article.
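As a rough sketch of that direction (the file and function names here are hypothetical; it only moves the handler shown above out of main.ts and reuses a single PrismaClient):
// prisma-ipc.ts (hypothetical module)
import { ipcMain } from 'electron';
import { PrismaClient } from '@prisma/client';

// One shared client instead of creating a new one per request
const prisma = new PrismaClient();

export function registerPrismaHandlers(): void {
    ipcMain.handle('prisma-channel', async (_event, args) => {
        await prisma.testTable.create({ data: { value: args } });
        const rows = await prisma.testTable.findMany();
        return JSON.stringify(rows);
    });
}

// main.ts would then only import and call registerPrismaHandlers() during app startup.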

Firebase Firestore rules path in production

Following the docs, I tried to debug the path
When I'm using the emulator locally, the path is like /databases/$(database)/documents/col1/doc1/col2/doc2/...
so this resource['__name__'][0] will be databases
and resource['__name__'][4] will be doc1
For example, this test function will return true (locally, using the emulator):
function test() {
    return string(resource['__name__'][4]).matches('doc1');
}
Or:
function test() {
    return string(resource['__name__'][0]).matches('databases');
}
But when I deploy the rules, the test function returns false.
Is it because the path in production is different? If so, how can I see it? Is there any example of a production path?

Angular 2 submodule routing with AOT and Rollup

What is the correct way to configure a submodule in Angular 2 so that it will work after AOT and rollup? I'm not concerned about lazy loading and will be happy for all submodules to be bundled together, but loadChildren is the cleanest way of referencing a submodule and having it use the correct <router-outlet>, and although I've tried various methods, none will work in both development and production.
I followed the instructions in the AOT cookbook to prepare my app for deployment. My module structure is root > account > admin, and I want the admin routes to load in the outlet defined by the account component.
Here's the router config for the account module:
import { NgModule } from '@angular/core';
import { Routes } from '@angular/router';
import { AdminModule } from './admin/admin.module';

const accountRoutes: Routes = [{
    path: 'account',
    component: AccountComponent,
    children: [
        { path: 'setup', component: SetupComponent },
        { path: 'admin', loadChildren: () => AdminModule }
    ]
}];
This works in development, but NGC compilation fails with Error: Error encountered resolving symbol values statically. Reference to a local (non-exported) symbol 'accountRoutes'.
I added an export to accountRoutes but this also fails to compile with Error: Error encountered resolving symbol values statically. Function calls are not supported..
One suggestion was to use a string instead of a function so that NGC can compile the code:
loadChildren: 'app/account/admin/admin.module#AdminModule'
This works in development and compiles successfully but the compiled app will not run, because SystemJS is unavailable. The error is:
ERROR Error: Uncaught (in promise): TypeError: System.import is not a function
TypeError: System.import is not a function
at t.loadFactory (build.js:6)
at t.load (build.js:6)
at t.loadModuleFactory (build.js:12)
I tried including SystemJS in the production build but it could not find the separate module file app/account/admin/admin.module.ngfactory - after rollup everything is in one build.js. There may be a way to have separate rollups build each submodule, but that's a lot of work.
I found the suggestion of referring to an exported function:
export function loadAdminModule() {
    return AdminModule;
}

loadChildren: loadAdminModule
This works in development, but in production the runtime compiler is unavailable so it logs build.js:1 ERROR Error: Uncaught (in promise): Error: Runtime compiler is not loaded.
The helper function can be modified so it works in production by referencing the NgFactory instead, but this won't work in development.
import { AdminModuleNgFactory } from '../../../aot/src/app/account/admin/admin.module.ngfactory';

export function loadAdminModule() {
    return AdminModuleNgFactory;
}

loadChildren: loadAdminModule
Is there a supported way of using loadChildren so that it will work with the instructions in the AOT cookbook? Is it better to use webpack instead?
For anyone having the same problem, here's the temporary workaround I'm using. Hopefully a better answer will come along soon.
I now define my submodules in a dedicated file, submodules.ts. I have two versions of this file, submodules-jit.ts for development, and submodules-aot.ts for AOT/rollup deployment to production. I gitignore submodules.ts and have added commands to my npm start and npm build:aot scripts that substitute in the right one.
// submodules-jit.ts
import { AdminModule } from './account/admin/admin.module'
export function adminModule(): any { return AdminModule; }
// submodules-aot.ts
import { AdminModuleNgFactory } from '../../aot/src/app/account/admin/admin.module.ngfactory'
import { AdminModule } from './account/admin/admin.module'
export function adminModule(): any { return AdminModuleNgFactory; }
export function adminModuleKeep(): any { return AdminModule; }
AdminModule must be referenced in the AOT file so it is preserved.
Then I use the adminModule function in my routes:
import { adminModule } from '../submodules'
{ path: 'admin', loadChildren: adminModule }
Finally for convenience, npm scripts drop in the correct file.
"build:aot": "cp src/submodules-aot.ts src/app/submodules.ts && ngc -p tsconfig-aot.json && rollup -c rollup-config.js",
"start": "cp src/submodules-jit.ts src/app/submodules.ts && concurrently \"npm run build:watch\" \"npm run serve\""
Needless to say this is a horrible hack, but it keeps the routing files clean and most of the evil in one place.
In my case I solved it thanks to your hint about loading routes from an external module, and then by reading this issue on the angular/cli issues page.
So I ended up refactoring my routes from:
import { TestModule } from './pathToModule/test.module';

export function exportTestModule() {
    return TestModule;
}
. . .
{
    path: 'test',
    loadChildren: exportTestModule
}
To:
. . .
{
    path: 'test',
    loadChildren: './pathToModule#TestModule'
}

Meteor : "console" variable undefined inside require call

I'm facing a strange problem with Meteor and I can't resolve it:
I'm developing a WebRTC app using Meteor, PeerJS and AdapterJS (which provides a WebRTC plugin for unsupported browsers like Safari or IE). These two libs are installed using npm: meteor npm install peerjs/adapterjs
So in my view's controller I have :
view.js
//import Peer from 'peerjs'; => same error with "import"
//import AdapterJS from 'adapterjs';
Template.view.onRendered(function(){
    AdapterJS = require("adapterjs");
    Peer = require("peerjs");
    //var peerkey="..."
    var peer = new Peer({
        key: peerkey, // get a free key at http://peerjs.com/peerserver
        debug: 3,
        config: {'iceServers': [
            { url: 'stun:stun.l.google.com:19302' },
            { url: 'stun:stun1.l.google.com:19302' },
        ]}
    });
});
But when I run my controller, I get an exception because "console" is undefined inside peerjs/util.js when the Peer constructor is called:
Uncaught TypeError: Cannot read property 'log' of undefined
Strangely, when I only require "peerjs", there is no exception...
I tried to change the order of the require calls but it didn't help.
Other variables like "alert" and "window.console" work and are defined inside the module, but "console" is not.. :/
Any suggestion would help ^^
Thanks in advance.
EDIT: If I add a breakpoint on the first line of node_modules/peerjs/lib/util.js, I see that the "console" variable is undefined inside util.js but .... it is defined inside the caller function (fileEvaluate)!
EDIT2: I tried something else to check whether the code inside adapterjs redefines or changes something: I put require("adapterjs") inside a timeout with a long delay (10 seconds) and .... console is still undefined inside the peer module! But when I comment out require("adapterjs"), there is no error and console is defined! I think Meteor does something special before running the controller script depending on the require calls...
EDIT3 : Here is a git repo to test the project : gitlab.com
If you display the dev console, you will see exceptions.
I found a solution, although I don't fully understand it myself. It's something to do with the way that Meteor imports modules, and Peerjs runs afoul of that.
Basically I copied node_modules/peerjs/dist/peer.js into the client directory, so that Meteor will load it "as is".
A small change to main.js as follows:
import './main.html';
// I placed peer.js from the node_modules/peerjs/dist/peer.js into the client folder and it works fine
// import {Peer} from 'peerjs';
import {AdapterJS as Adapter} from 'adapterjs';
Template.hello.onCreated(function helloOnCreated() {
// counter starts at 0
window.peer = new Peer({
and it works fine :)
I see in line 860+ of adapter.js that console is being defined (part of the shim) from https://github.com/Temasys/AdapterJS/blob/master/source/adapter.js
// IE 9 is not offering an implementation of console.log until you open a console
if (typeof console !== 'object' || typeof console.log !== 'function') {
    /* jshint -W020 */
    console = {} || console;
    // Implemented based on console specs from MDN
    // You may override these functions
    console.log = function (arg) {};
    console.info = function (arg) {};
    console.error = function (arg) {};
This code defines the console if it doesn't find one to its liking? Does this mean you are using IE9 or some other incompatible browser?
Try pasting this into your console and see what it tells you:
if (typeof console !== 'object' || typeof console.log !== 'function') alert("Console not present, needs to be shimmed?"); else console.log("console is ok");
Presumably the reason you are using adapter.js is for compatibility purposes; this will help you troubleshoot it. Please let me know what you find, as I will be following you down this path :)
