Setup
I have a monorepo setup with the following file structure:
├── functions
│   ├── src
│   └── package.json
├── shared
│   ├── dist
│   ├── src
│   └── package.json
├── frontend
│   └── ...
└── firebase.json
Approach 1 (failed)
./shared holds TypeScript classes shared between ./functions and ./frontend. Ideally, I want to reference the shared lib from functions/package.json using a symlink, to avoid having to re-install after every change to my shared code (where most of the functionality resides).
However, this does not work (neither using link:, nor an absolute file: path, nor a relative file: path):
// functions/package.json
...
"dependencies": {
  "shared": "file:/home/boern/Desktop/wd/monorepo/shared"
  ...
}
This results in an error upon firebase deploy --only functions (error Package "shared" refers to a non-existing file '"home/boern/Desktop/wd/monorepo/shared"'). The library, despite being present in ./functions/node_modules/, was apparently not transferred to the server.
Approach 2 (failed)
Also, setting "functions": {"ignore": []} in firebase.json did not help.
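For context, a minimal sketch of what that override looks like in firebase.json (the functions.ignore array normally lists glob patterns excluded from the upload, so the idea behind an empty array is to upload everything, including node_modules):

```json
{
  "functions": {
    "ignore": []
  }
}
```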
Approach 4 (works, but lacks requirement a; see Goal)
The only thing that DID work was a proposal by adevine on GitHub:
// functions/package.json
...
"scripts": {
  ...
  "preinstall": "if [ -d ../shared ]; then npm pack ../shared; fi"
},
"dependencies": {
  "shared": "file:./bbshared-1.0.0.tgz"
  ...
}
Goal
Can someone point out a way to reference a local library such that a) ./functions always uses an up-to-date version during development, and b) deployment using the stock Firebase CLI succeeds (and not, e.g., using firelink)? Or is this simply not supported yet?
Here's my workaround to make approach 4 work (run this from the ./functions folder):
rm -rf ./node_modules
yarn cache clean # THIS IS IMPORTANT
yarn install
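A possible refinement (my assumption, not part of the original post): firebase.json supports a functions.predeploy hook, which could re-run the install step (and with it approach 4's preinstall pack script) automatically before every deploy. A sketch, where $RESOURCE_DIR is the variable the CLI expands to the functions directory:

```json
{
  "functions": {
    "predeploy": [
      "npm --prefix \"$RESOURCE_DIR\" install"
    ]
  }
}
```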
Related
Is there a way to have a Firebase/Google Cloud Function with this kind of architecture using the CLI command (firebase deploy --only functions)?
Expected:
.
└── functions/
├── function_using_axios/
│ ├── node_modules/
│ ├── package.json
│ └── index.js
└── function_using_moment/
├── node_modules/
├── package.json
└── index.js
Currently, my architecture looks like this:
.
└── functions/
├── node_modules/
├── package.json
├── index.js
├── function_using_axios.js
└── function_using_moment.js
The fact is, I have a lot of unnecessary package dependencies for some functions, and it increases cold start time.
I know this is possible with the web UI: there, each function can have its own package. My current architecture, as seen in the web UI, has one package for all functions.
Any ideas? Thanks.
When deploying through Firebase there can only be a single index.js file, although gcloud may be different in this respect.
To ensure you only load the dependencies that each function needs, move the require for each dependency into the functions that need it:
exports.usageStats = functions.https.onRequest((request, response) => {
  // Load the dependency only when this function is invoked, so other
  // functions' cold starts don't pay for it.
  const module = require('your-dependency');
  // ...
});
Also see:
the Firebase documentation on organizing functions, which shows a way to spread the functions over multiple files (although you'll still need to import/export them all in index.js).
I have been working through the guidance available at:
https://www.drupal.org/docs/creating-custom-modules
I added my custom module to my composer.json, like so:
composer config repositories.mygit \
'{ "type": "vcs",
"url": "git@git.mydomain.com:cf_supporters_for_drupal.git",
"ssh2": { "username": "git",
"privkey_file": "/var/lib/jenkins/.ssh/id_rsa",
"pubkey_file": "/var/lib/jenkins/.ssh/id_rsa.pub" } }'
composer require ymd/cf_supporters_for_drupal
In the path with my composer.json file, I run:
drupal$ find . -name cf_supporters_for_drupal
./vendor/ymd/cf_supporters_for_drupal
Browsing to it and using git status and git log, I have determined that I have the newest version installed.
And yet, I see no evidence in the /admin/modules path that the module is available to me. I'm curious how I might begin to debug this issue. Can anyone provide any guidance beyond what I already see at https://www.drupal.org/docs/creating-custom-modules/let-drupal-know-about-your-module-with-an-infoyml-file#debugging?
~/sandbox/cf_supporters_for_drupal $ cat cf_supporters_for_drupal.info.yml
name: CF Supporters for Drupal Module
description: Exposes the cf_supporters_mojo application on a drupal web site.
package: Custom
type: module
version: 1.0
core: 8.x
configure: cf_supporters_for_drupal.settings
~/sandbox/cf_supporters_for_drupal $ cat composer.json
{
"name": "ymd/cf_supporters_for_drupal",
"description": "A drupal module to expose cf_supporters_mojo",
"type": "module",
"license": "GPL-2.0-or-later"
}
~/sandbox/cf_supporters_for_drupal $ tree .
.
├── cf_supporters_for_drupal.info.yml
├── cf_supporters_for_drupal.links.menu.yml
├── cf_supporters_for_drupal.routing.yml
├── composer.json
├── LICENSE
├── README.md
└── src
└── Controller
└── CFSupportersForDrupalController.php
2 directories, 7 files
UPDATE #1:
2pha, in a comment below, suggests I need to put this code in a modules folder rather than simply in the vendor folder. My questions back in 2pha's direction are:
I assume I want to put it in web/modules/custom? Is that right? And how, using the composer config CLI tool (I need to script this as much as possible), would I make that happen?
Yes, thank you, 2pha!
Late last night, I found, documented here:
https://github.com/composer/installers
that Composer has built-in support for fourteen Drupal-specific 'types', including 'drupal-custom-module'. I have not (yet) found a way to manipulate the extra.installer-paths hash in composer.json using composer config. But doing so manually, as you suggest above, resulted in my module appearing on the /admin/modules and /admin/config pages. For the moment that is close enough, and I have now turned my attention to creating a configuration page for my module.
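For reference, a sketch of the two pieces involved (the paths are my assumption; adjust them to your docroot). composer/installers keys off the package's type field, so the module's own composer.json should declare "type": "drupal-custom-module" rather than "module", and the site's composer.json needs a matching installer-paths entry:

```json
{
  "extra": {
    "installer-paths": {
      "web/modules/custom/{$name}": ["type:drupal-custom-module"],
      "web/modules/contrib/{$name}": ["type:drupal-module"]
    }
  }
}
```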
I have developed a webapp based on Symfony 3.4. In production it is deployed on an Ubuntu 18.04 server via Deployer (deployer.org).
Everything runs fine so far. The webapp is deployed to /opt/app/prod by a user that is part of the group www-data.
My webapp allows the upload of files. To support this, I added a data folder which stores the uploaded files.
In order to retain access to the files after another release, I added the data folder to the list of shared folders.
My deploy.php looks as follows:
set('bin_dir', 'bin');
// Symfony console bin
set('bin/console', function () {
return sprintf('{{release_path}}/%s/console', trim(get('bin_dir'), '/'));
});
// Project name
set('application', 'appname');
set('http_user', 'www-data');
set('writable_mode', 'acl');
// Project repository
set('repository', '<MY_GITREPO>');
// [Optional] Allocate tty for git clone. Default value is false.
set('git_tty', true);
// Shared files/dirs between deploys
add('shared_files', []);
add('shared_dirs', ['data']);
// Writable dirs by web server
add('writable_dirs', ['{{release_path}}','data']);
// Hosts
host('prod')
->hostname('<MY_HOST>')
->user('<MY_USER>')
->stage('prod')
->set('deploy_path', '/opt/app/prod/<MY_APPNAME>');
This leads to the following folder structure:
.
├── current -> releases/5
├── releases
│ ├── 2
│ ├── 3
│ ├── 4
│ └── 5
└── shared
├── app
└── data
So everything fine so far - with one exception:
Deployer wants to setfacl the data folder, which is not allowed, as the files in data belong to www-data:www-data, and this is what Deployer tries to change:
The command "export SYMFONY_ENV='prod'; cd /opt/app/prod/<MY_APPNAME>/releases/5 && (setfacl -RL -m u:"www-data":rwX -m u:`whoami`:rwX /opt/app/prod/<MY_APPNAME>/releases/5)" failed.
setfacl: /opt/app/prod/<MY_APPNAME>/releases/5/data/child/679/ba7f9641061879554e5cafbd6a3a557b.jpeg: Operation not permitted
I have the impression that I made a mistake in my deploy.php, or I missed something.
Does someone have an idea what I need to do in order to get my deployment running?
Thanks and best regards
While testing out conan, I had to "pip install" it.
As I am working in a fully offline environment, my expectation was that I could simply:
1. Manually deploy all dependencies listed in https://github.com/conan-io/conan/blob/master/conans/requirements.txt to a local PyPI repository called myrepo-python
2. Install conan with:
pip3 install --index http://myserver/artifactory/api/pypi/myrepo-python/simple conan
This works fine for some packages and then fails for the dependency on patch == 1.16:
[...]
Collecting patch==1.16 (from conan)
Could not find a version that satisfies the requirement patch==1.16 (from conan) (from versions: )
No matching distribution found for patch==1.16 (from conan)
Looking into the Artifactory logs shows that even though I manually deployed patch-1.16.zip (from https://pypi.org/project/patch/1.16/#files) into the repository, it is not present in the index:
The .pypi/simple.html file doesn't contain an entry for 'patch' (checked from the Artifactory UI)
The server logs ($ARTIFACTORY_HOME/logs/artifactory.log) show the file being added to the indexing queue but don't contain a line saying that it got indexed
Does anyone know why patch-1.16.zip is not indexed by Artifactory?
This is on Artifactory 5.8.4.
For now, my only workaround is to gather all the dependencies into a local path and point pip3 at it:
scp conan-1.4.4.tar.gz installserver:/tmp/pip_cache
[...]
scp patch-1.16.zip installserver:/tmp/pip_cache
[...]
scp pyparsing-2.2.0-py2.py3-none-any.whl installserver:/tmp/pip_cache
ssh installserver
installserver$ pip3 install --no-index --find-links="/tmp/pip_cache" conan
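If this offline setup is permanent, the same flags can be persisted in a pip configuration file instead of being typed each time (a sketch; the path shown is the common per-user location):

```ini
# ~/.config/pip/pip.conf (per-user pip configuration)
[global]
no-index = true
find-links = /tmp/pip_cache
```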
The reason you are unable to install the "patch" PyPI package via Artifactory is that it does not comply with the Python packaging spec.
Based on Python spec (https://www.python.org/dev/peps/pep-0314/ and https://packaging.python.org/tutorials/packaging-projects/), the structure of a Python package should be, for example:
└── patch-1.16.zip
└── patch-1.16
├── PKG-INFO
├── __main__.py
├── patch.py
└── setup.py
However, the zip archive (can be found here https://pypi.org/project/patch/1.16/#files) is structured like that:
└── patch-1.16.zip
├── PKG-INFO
├── __main__.py
├── patch.py
└── setup.py
Artifactory searches for the metadata file (PKG-INFO in this case) in a certain pattern (inside any folder). Since PKG-INFO is at the root of the zip (and not inside a folder), Artifactory cannot find it; therefore, the package's metadata will not be calculated and it will not appear in the "simple" index file (see the error in artifactory.log). As a result, you are unable to install it with pip.
Workaround:
What you can do is manually change the structure to the correct one:
Create a folder named patch-1.16 and extract the zip into it. Then zip the whole folder, so you get the structure like the example above. Finally, deploy this zip to Artifactory.
This time, the PKG-INFO file will be found, the metadata will be calculated and pip will be able to install it.
EDIT: the Meteor 1.3 release is out, and an npm package is about to be released allowing direct use of CSS Modules without Webpack.
I would like to use https://github.com/gajus/react-css-modules in Meteor 1.3 via npm. But the readme says to use Webpack. I have never used Webpack, as it seems to me to do the same build job as Meteor.
So do you know a way, in this specific case, to use react-css-modules in Meteor 1.3 beta?
There is actually a package for this. MDG is also considering bringing Webpack into Meteor core at some stage. And yes, it is a build tool, just more modular and faster than the current one. It's also pretty complex as build tools go, at least in my opinion.
To have Webpack in Meteor, just:
meteor add webpack:webpack
meteor remove ecmascript
You need to remove ecmascript, as you get it from Webpack as well, and having two installs can cause errors.
For a much more complete answer, check Sam Corcos's blog post and also Ben Strahan's comment for Meteor 1.3 Beta. I used it as a tutorial to get a different Webpack package up.
https://medium.com/@SamCorcos/meteor-webpack-from-the-ground-up-f123288c7b75#.phcq5lvm8
For the package you mentioned, I think webpack.packages.json should look something like this:
{
  "private": true,
  "scripts": {
    "start": "webpack-dev-server"
  },
  "devDependencies": {
    "babel-core": "^6.4.5",
    "babel-loader": "^6.2.1",
    "babel-preset-es2015": "^6.3.13",
    "babel-preset-react": "^6.3.13",
    "babel-preset-stage-0": "^6.3.13",
    "css-loader": "^0.23.1",
    "extract-text-webpack-plugin": "^1.0.1",
    "style-loader": "^0.13.0",
    "webpack": "^2.0.6-beta",
    "webpack-dev-server": "^2.0.0-beta"
  },
  "dependencies": {
    "react": "^0.15.0-alpha.1",
    "react-css-modules": "^3.7.4",
    "react-dom": "^0.15.0-alpha.1"
  }
}
And webpack.config.js you could copy directly from
https://github.com/gajus/react-css-modules-examples/blob/master/webpack.config.js
Meteor v1.3.2 introduced built-in import functionality for .css files (as well as other CSS preprocessor files, such as less and sass) from within .js and .jsx.
For example, given the following (simplified) folder structure,
.
├── client
│ └── main.js
├── imports
│ └── client
│ ├── main.css
│ └── main.jsx
├── node_modules
│ └── some-module
│ └── dist
│ └── css
│ └── main.css
├── package.json
└── server
└── main.js
where some-module is an npm module installed using:
$ meteor npm install --save some-module
you can import local and module stylesheets in imports/client/main.jsx:
// importing a style file from a node module
import 'some-module/dist/css/main.css';
// importing a style from a local file
import './main.css';
You can start from scratch like this.
Start from scratch
meteor create test-project
cd test-project
npm init
meteor remove ecmascript
meteor add webpack:webpack
meteor add webpack:react
meteor add webpack:less
meteor add react-runtime # Skip this step if you want to use the NPM version
meteor add react-meteor-data
meteor
npm install
meteor
Entry files
Your entry files are defined in your package.json: "main" is your server entry and "browser" is your client entry.
{
"name": "test-project",
"private": true,
"main": "server/entry.js",
"browser": "client/entry.js"
}
For more info please check this link