Does webdriverio support folder upload?

I am trying to upload a folder, using the code below from the official WebdriverIO documentation.
const path = require('path');
const filePath = path.join(__dirname, 'path/to/your/file');
const remoteFilePath = browser.uploadFile(filePath);
$('.upload-data-file-input').setValue(remoteFilePath);
If I remove the third line, it works fine.


trouble using next-transpile-modules in @nrwl/nextjs monorepo

My understanding is that I fall into Group 1: those who are running a [nextjs] monorepo and therefore want to be able to import their other packages from node_modules.
And running into an error similar to this:
../../node_modules/@waweb/base-ui.theme.brand-definition/dist/brand-definition.module.scss
CSS Modules cannot be imported from within node_modules. Read more:
https://nextjs.org/docs/messages/css-modules-npm Location:
../../node_modules/@waweb/base-ui.theme.brand-definition/dist/index.js
The official solution is next-transpile-modules, but as soon as I add any packages to the list of modules, I start getting errors in CSS modules in local source.
../../libs/ui/src/lib/contact.module.css
CSS Modules cannot be imported from within node_modules.
Read more: https://nextjs.org/docs/messages/css-modules-npm
Location: ../../libs/ui/src/lib/learn-more.tsx
Import trace for requested module:
../../libs/ui/src/lib/learn-more.tsx
../../libs/ui/src/lib/home.tsx
./pages/index.tsx
This is repeated for all components that were previously working.
I have prepared a branch in a public repo, with full CI/CD and a Gitpod dev environment configured, that demonstrates the critical change.
Let's assume the sources to the components I am attempting to transpile are located in the correct node_modules dir, and I am using the following next config:
// eslint-disable-next-line @typescript-eslint/no-var-requires
const withNx = require('@nrwl/next/plugins/with-nx');
const withPlugins = require('next-compose-plugins');
const withTM = require('next-transpile-modules')(
  [
    '@waweb/base-ui.theme.colors',
    '@waweb/base-ui.theme.color-definition',
    '@waweb/base-ui.theme.size-definition',
    '@waweb/base-ui.theme.shadow-definition',
    '@waweb/base-ui.theme.brand-definition',
    '@waweb/base-ui.theme.theme-provider',
  ],
  { debug: true }
);
const withPWA = require('next-pwa');
/**
 * @type {import('@nrwl/next/plugins/with-nx').WithNxOptions}
 **/
const nextConfig = {
  nx: {
    // Set this to true if you would like to use SVGR
    // See: https://github.com/gregberge/svgr
    svgr: true,
  },
  images: {
    domains: [
      'www.datocms-assets.com',
      'a.storyblok.com',
      'images.ctfassets.net',
      'images.prismic.io',
      'cdn.aglty.io',
      'localhost', // For Strapi
    ],
    imageSizes: [24, 64, 300],
  },
};
const pwaConfig = {};
const plugins = [[withNx], [withPWA, pwaConfig]];
module.exports = withTM(withPlugins([...plugins], nextConfig));
Any idea what's wrong with my setup here?
Thank you all for any thoughts as to what I'm doing wrong here.
Cheers!
Edit
For some additional context: I have tried many different variations, and the one I ended up with (shown above) is what got the module transpilation to actually work, according to the debug statements. Only now do I get the reported errors in modules that are actually source components, not node_modules. Using the plugin at all seems to break unrelated functionality.
It looks odd to me that you are wrapping withPlugins inside of withTM...
withTM is a plugin, so I would imagine it should follow this format:
module.exports = withPlugins([
  withTM
], nextConfig);
This seems to be what's expected when looking at the docs:
https://www.npmjs.com/package/next-transpile-modules
https://www.npmjs.com/package/next-compose-plugins

Where should I store a JSON file and fetch its data in Next.js whenever I need it?

Project:
I am working on an E-commerce application and it has more than 1,600 products and 156 categories.
Problem:
Initially, on the first product page, 30 products will be fetched (due to the page limit), but in the left sidebar I need filters derived from the tags of all 1,600 products. That is why I need all the products in the first fetch: I will extract the common tags by looping over all the products and show them immediately in the sidebar.
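The tag-aggregation step described here, looping over all products to collect the distinct tags for the sidebar, could be sketched like this (assuming each product carries a `tags` array; adjust the field name to your actual schema):

```javascript
// Collect the distinct tags across all products, with a count per tag
// (useful for rendering "Tag (n)" style sidebar filters).
function collectTags(products) {
  const counts = new Map();
  for (const product of products) {
    for (const tag of product.tags || []) {
      counts.set(tag, (counts.get(tag) || 0) + 1);
    }
  }
  return [...counts.entries()].map(([tag, count]) => ({ tag, count }));
}
```

With 1,600 products this loop is cheap; the expensive part is fetching the data, which is what the question is about.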
What do I want?
I am not sure, but I think the best solution would be to generate a JSON file containing all the products and store it somewhere I can fetch it from by just hitting a URL with a REST call in Next.js (in either getServerSideProps or getStaticProps).
Caveat:
I tried storing the JSON file in the ./public directory of the Next.js application; it worked on localhost but not on Vercel.
Here is the code I wrote for storing JSON file in ./public directory:
fs.writeFileSync("./public/products.json", JSON.stringify(products, null, 2)); //all 1,600 products
One solution is to fetch it directly from the front-end (if the file is not too big); otherwise, to read the file in getServerSideProps you will need a custom webpack configuration.
// next.config.js
const path = require("path")
const CopyPlugin = require("copy-webpack-plugin")

module.exports = {
  target: "serverless",
  future: {
    webpack5: true,
  },
  webpack: function (config, { dev, isServer }) {
    // Fixes npm packages that depend on the `fs` module
    if (!isServer) {
      config.resolve.fallback.fs = false
    }
    // copy the files you're interested in
    if (!dev) {
      config.plugins.push(
        new CopyPlugin({
          patterns: [{ from: "content", to: "content" }],
        })
      )
    }
    return config
  },
}
Then you can create a utility function to get the file:
import path from "path"
import { promises as fs } from "fs"

export async function getStaticFile(file) {
  let basePath = process.cwd()
  if (process.env.NODE_ENV === "production") {
    basePath = path.join(process.cwd(), ".next/server/chunks")
  }
  const filePath = path.join(basePath, file) // the parameter, not the literal string "file"
  const fileContent = await fs.readFile(filePath, "utf8")
  return fileContent
}
There is an open issue regarding this:
Next.js API routes (and pages) should support reading files
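Assuming the copied file is your products.json, usage from getServerSideProps might look like the sketch below; parseProducts is my own helper, not part of Next.js:

```javascript
// Parse the raw string returned by getStaticFile into product objects,
// failing loudly if the file does not contain a JSON array.
function parseProducts(raw) {
  const products = JSON.parse(raw);
  if (!Array.isArray(products)) {
    throw new Error("products.json should contain an array");
  }
  return products;
}

// Hypothetical Next.js usage:
// export async function getServerSideProps() {
//   const raw = await getStaticFile("content/products.json");
//   return { props: { products: parseProducts(raw) } };
// }
```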

How to serve a Next.js app using KeystoneJS?

I have a problem serving a Next.js app with KeystoneJS. I want to achieve something similar to the to-do Nuxt example you can choose while creating a Keystone project. I used the code from https://www.keystonejs.com/keystonejs/app-next/ in my index.js file, but I get an error when trying to run the app:
ReferenceError: distDir is not defined at Object.
What should I do?
This is what I have in my setup:
create an app folder (the Next.js code)
create next.config.js in the app folder
My next.config.js file has the following code:
const distDir = 'dist';

module.exports = {
  distDir: `../${distDir}/www`,
  env: {
    SERVER_URL: process.env.SERVER_URL || 'http://localhost:4000',
  },
  publicRuntimeConfig: {
    // Will be available on both server and client
    // staticFolder: '/static',
  },
};
It exports the build artifacts to dist/www in the root of the Keystone project.
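For reference, next.config.js and the Keystone index.js do not share scope, so a `distDir` variable referenced in index.js must be declared there as well, which is the likely cause of the ReferenceError. A minimal index.js sketch (package names per the Keystone 5 docs; verify against your Keystone version):

```javascript
// index.js (sketch only)
const { Keystone } = require('@keystonejs/keystone');
const { GraphQLApp } = require('@keystonejs/app-graphql');
const { AdminUIApp } = require('@keystonejs/app-admin-ui');
const { NextApp } = require('@keystonejs/app-next');

const distDir = 'dist'; // must be declared here, not only in next.config.js

const keystone = new Keystone({
  /* adapter, session config, etc. */
});

module.exports = {
  keystone,
  apps: [
    new GraphQLApp(),
    new AdminUIApp(),
    new NextApp({ dir: 'app' }), // serve the Next.js code in ./app
  ],
  distDir,
};
```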

INVALID_ARGUMENT: Location 'europe-west1' is not a valid location

The error I get after calling the code below (shortened for brevity) within a Cloud Function is:
Error: 3 INVALID_ARGUMENT: Location 'europe-west1' is not a valid location. Use ListLocations to list valid locations.
If I change the location to "us-central", for example, the error changes to:
Error: 3 INVALID_ARGUMENT: Location must equal europe-west1 because the App Engine app that is associated with this project is located in europe-west1
I have had a look on Stack Overflow for something similar but came up short. I left a comment on this question to see if the OP had any luck:
Google Cloud Tasks: Location 'europe-west1' is not a valid location
Where am I going wrong?
Thanks!
JT
OS: Google Cloud Functions
Node.js version: 12
npm version: 6.14.10
@google-cloud/tasks version: 2.3.0
Steps to reproduce
const functions = require('firebase-functions');
const { CloudTasksClient } = require('@google-cloud/tasks');
const admin = require('firebase-admin');

const client = new CloudTasksClient({ fallback: true });

// Omitting the actual triggered function
async function createHttpTasks(session) {
  const project = "XXXXX"; // These match my project and queue id/name
  const queue = "XXXXXX";
  const location = "europe-west1";
  // Construct the fully qualified queue name.
  const parent = client.queuePath(project, location, queue);
  // Do stuff
  const requestCheck = {parent, taskCheckIn};
  await client.createTask(requestCheck);
}
I had two things wrong:
const client = new CloudTasksClient({ fallback: true });
should have been:
const client = new CloudTasksClient();
and
const requestCheck = {parent, taskCheckIn};
should have been
const requestCheck = {parent, task: taskCheckIn};
Thanks to the gcloud team for responding to my issue:
https://github.com/googleapis/nodejs-tasks/issues/509
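For reference, the fully qualified queue name that client.queuePath builds has the form projects/&lt;project&gt;/locations/&lt;location&gt;/queues/&lt;queue&gt;. A hand-rolled equivalent (my own helper, not part of the library) that also trims stray whitespace, another way to trigger the "not a valid location" error:

```javascript
// Build the fully qualified queue name by hand; client.queuePath(project,
// location, queue) produces the same string. Trimming catches accidental
// trailing spaces in the inputs.
function queueParent(project, location, queue) {
  return `projects/${project.trim()}/locations/${location.trim()}/queues/${queue.trim()}`;
}
```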
If anyone else is getting this error, make sure there are no spaces in the location variable. For example, there is a space at the end of the variable:
location="europe-west-3 "

gulp sass: merge folders with file inheritance

I have a folder structure like this (simplified):
|-project1
|--_file1.scss
|--_file2.scss
|--_file3.scss
|
|-project2
|--_file2.scss
|
|-css
|--project1.css
|--project2.css
I am looking for a way to compile the Sass files with inheritance. The idea is that I have a base project (project1) and project2 contains only the files that need to be changed.
So upon compilation gulp should render 2 CSS files:
project1.css
This contains only the files from the project1 folder.
project2.css
This one should contain file1 and file3 from project1 and file2 from project2.
Is this possible? What modules would be needed?
Thank you
Here is something that should work for you. Note that your project folders contain only partials, i.e. _file1.scss, _file2.scss, etc. You will need at least one file that is not a partial and that imports those partials for Sass to produce any output.
const gulp = require('gulp');
const fs = require('fs');
const path = require('path');
const filter = require('gulp-filter');
const sass = require('gulp-sass');
const concat = require('gulp-concat');
const addsrc = require('gulp-add-src');
// const glob = require("glob");

const sources = ['project1', 'project2', 'project3'];
// could glob your source folders with something like
// const sources = glob.sync("project*");

function isUnique(file, project) {
  // file1.scss, file2.scss, etc., all coming from project1
  const baseName = path.basename(file.path);
  // does that basename exist in the current project folder?
  return !fs.existsSync(project + path.sep + baseName);
}

gulp.task('default', function () {
  // loop through all the project folders
  sources.forEach(function (project) {
    // always using project1 files as the basis
    const stream = gulp.src('./project1/*.scss')
      // filter out of the base stream any file that also exists in the current project folder
      .pipe(filter(function (file) {
        return isUnique(file, project);
      }))
      // add all files from the current project folder, i.e. project2, project3, etc.
      .pipe(addsrc.append('./' + project + '/*.scss'))
      .pipe(sass().on('error', sass.logError))
      // name the concatenated output after the current project
      .pipe(concat(project + '.css'))
      .pipe(gulp.dest('css'));
    return stream;
  });
});
This works for any number of project folders; just make sure that the non-partial scss file that imports the others is not one that gets filtered out.
But it looks like you don't really want any partials anyway, since you want to concat all the files in each project, so remove the leading underscores from the file names.
