I have been working through the guidance available at:
https://www.drupal.org/docs/creating-custom-modules
I added my custom module to my composer.json, like so:
composer config repositories.mygit \
  '{ "type": "vcs",
     "url": "git@git.mydomain.com:cf_supporters_for_drupal.git",
     "ssh2": { "username": "git",
               "privkey_file": "/var/lib/jenkins/.ssh/id_rsa",
               "pubkey_file": "/var/lib/jenkins/.ssh/id_rsa.pub" } }'
composer require ymd/cf_supporters_for_drupal
In the directory containing my composer.json file, I run:
drupal$ find . -name cf_supporters_for_drupal
./vendor/ymd/cf_supporters_for_drupal
Browsing to it and using git status and git log, I have determined that I have the newest version installed.
And yet, I see no evidence on the /admin/modules page that the module is available to me. How might I begin to debug this issue? Can anyone provide any guidance beyond what I already see at: https://www.drupal.org/docs/creating-custom-modules/let-drupal-know-about-your-module-with-an-infoyml-file#debugging ?
~/sandbox/cf_supporters_for_drupal $ cat cf_supporters_for_drupal.info.yml
name: CF Supporters for Drupal Module
description: Exposes the cf_supporters_mojo application on a drupal web site.
package: Custom
type: module
version: 1.0
core: 8.x
configure: cf_supporters_for_drupal.settings
~/sandbox/cf_supporters_for_drupal $ cat composer.json
{
"name": "ymd/cf_supporters_for_drupal",
"description": "A drupal module to expose cf_supporters_mojo",
"type": "module",
"license": "GPL-2.0-or-later"
}
~/sandbox/cf_supporters_for_drupal $ tree .
.
├── cf_supporters_for_drupal.info.yml
├── cf_supporters_for_drupal.links.menu.yml
├── cf_supporters_for_drupal.routing.yml
├── composer.json
├── LICENSE
├── README.md
└── src
└── Controller
└── CFSupportersForDrupalController.php
2 directories, 7 files
UPDATE #1:
2pha, in a comment below, suggests I need to put this code in a modules folder, rather than simply in the vendor folder. My questions back in 2pha's direction are:
I assume I want to put it in web/modules/custom? Is that right? And how, using the composer config CLI tool (I need to script this as much as possible), would I make that happen?
Yes, thank you, 2pha!
Late last night, I found it documented here:
https://github.com/composer/installers
that Composer has built-in support for fourteen Drupal-specific 'types', including 'drupal-custom-module'. I have not (yet) found a way to manipulate the extra.installer-paths hash in composer.json using composer config. But doing so manually, as you suggest above, resulted in exposing my module on the /admin/modules and /admin/config pages. For the moment that is close enough, so I have now turned my attention to creating a configuration page for my module.
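The manual change amounted to something like the following in the site's top-level composer.json, combined with switching the module's own "type" from "module" to "drupal-custom-module" so that composer/installers knows where to place it (the web/modules/custom/ path is an assumption; adjust it to your docroot layout):
"extra": {
  "installer-paths": {
    "web/modules/custom/{$name}/": ["type:drupal-custom-module"]
  }
}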
In a create-next-app Next.js application, I want to move the pages folder from the root directory into a src folder. I added a jsconfig.json with the code below; however, I now get the error message "404 | This page could not be found." Does anyone have any insight? (Sorry, I'm a beginner to web development.)
{
  "compilerOptions": {
    "baseUrl": "src"
  }
}
Next.js supports moving /pages to the /src folder by default.
Create a /src folder in the root directory.
Delete the /.next folder.
Move /pages to the /src folder.
Remember that package.json, .gitignore, and other config files need to stay in the root directory and should not be moved to the /src folder.
Once that is done, just run npm run dev or yarn dev so you can view it on your localhost (see the command sketch below).
More: https://nextjs.org/docs/advanced-features/src-directory
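Assuming a standard create-next-app layout, the steps above boil down to roughly this, run from the project root:
mkdir src            # create the /src folder in the root directory
rm -rf .next         # delete the .next build folder
mv pages src/pages   # move /pages into /src (package.json and other config files stay in the root)
npm run dev          # start the dev server again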
In case you are using Next.js + Tailwind CSS, you need to change the following in tailwind.config.js after moving files under the src directory:
module.exports = {
  content: [
    './pages/**/*.{js,ts,jsx,tsx}',
    './components/**/*.{js,ts,jsx,tsx}',
  ],
  // ...
}
You need to stop the server and then run npm run dev again. That solved my problem when I moved things into the src directory and started getting 404 pages.
In the src case, replace the paths above with:
  content: [
    './src/pages/**/*.{js,ts,jsx,tsx}',
    './src/components/**/*.{js,ts,jsx,tsx}',
  ],
As @Thierry mentioned in the comments, according to the docs, "Pages can also be added under src/pages as an alternative to the root pages directory. The src directory is very common in many apps and Next.js supports it by default."
So, src/pages will be ignored if pages is present in the root directory.
More at the official docs: https://nextjs.org/docs/advanced-features/src-directory
src/pages will be ignored if pages is present in the root directory
Update tsconfig.json (if you use TypeScript):
"paths": {
"#/*": ["./src/*"]
}
Restart npm run dev.
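For context, a minimal tsconfig.json carrying that mapping might look like the sketch below; the "#/*" alias is just the one used above (many projects use "@/*" instead), and baseUrl is included for older TypeScript versions that require it alongside paths:
{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "#/*": ["./src/*"]
    }
  }
}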
Setup
I have a monorepo setup with the following file structure:
├── functions
│   ├── src
│   └── package.json
├── shared
│   ├── dist
│   ├── src
│   └── package.json
├── frontend
│   └── ...
└── firebase.json
Approach 1 (failed)
./shared holds TypeScript classes shared between the backend (./functions) and ./frontend. Ideally, I want to reference the shared lib from functions/package.json using a symlink, to avoid having to re-install after every change to my shared code (where most of the functionality resides).
However, this does not work (neither using link:, nor an absolute file: path, nor a relative file: path):
// functions/package.json
...
"dependencies": {
"shared": "file:/home/boern/Desktop/wd/monorepo/shared"
...
}
resulting in an error upon firebase deploy --only functions (error Package "shared" refers to a non-existing file '"home/boern/Desktop/wd/monorepo/shared"'). The library (despite being present in ./functions/node_modules/) was not transferred to the server.
Approach 2 (failed)
Also, setting "functions": {"ignore": []} in firebase.json did not help.
Approach 4 (works, but fails requirement (a); see Goal)
The only thing that DID work was a proposal by adevine on GitHub:
// functions/package.json
...
"scripts": {
...
"preinstall": "if [ -d ../shared ]; then npm pack ../shared; fi"
},
"dependencies": {
"shared": "file:./bbshared-1.0.0.tgz"
...
}
Goal
Can someone point out a way to reference a local library such that (a) ./functions always uses an up-to-date version during development and (b) deployment succeeds using the stock Firebase CLI (and not, e.g., firelink)? Or is this simply not supported yet?
Here's my workaround to make approach 4 work:
rm -rf ./node_modules
yarn cache clean # THIS IS IMPORTANT
yarn install
Run this from the ./functions folder.
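Put together with the preinstall hook from Approach 4, a refresh cycle looks roughly like this (commands taken from the snippets above; package and folder names are from my setup):
cd functions
rm -rf node_modules
yarn cache clean                   # important: drop the cached copy of the packed tarball
yarn install                       # preinstall re-runs `npm pack ../shared`, regenerating the .tgz
firebase deploy --only functions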
I'm trying to set up a pleasant way of working with WordPress and its plugins using Composer. My question is quite broad: how would you do it?
What I want is basically for it to install WordPress (which it is now doing), but the plugins I specify get installed in a folder named "vendor" and not in the "plugins" folder. Why is that?
Here is my composer.json:
{
  "name": "name",
  "description": "name Wordpress",
  "repositories": [
    {
      "type": "composer",
      "url": "https://wpackagist.org"
    }
  ],
  "require": {
    "timber/timber": "^1.3",
    "johnpbloch/wordpress-core-installer": "^0.2.1",
    "johnpbloch/wordpress": "^4.4"
  },
  "extra": {
    "installer-paths": {
      "wp-content/plugins/{$name}/": ["type:wordpress-plugin"],
      "wp-content/themes/{$name}/": ["type:wordpress-theme"]
    },
    "wordpress-install-dir": "wp"
  }
}
I'm pretty new to this idea of using composer for package management inside WP. But I found this interesting, so I looked into it.
The installation path is specific to the plugin. Also look for the wpackagist-plugin vendor name; other packages will probably not put code inside the wp folder.
If you require "wpackagist-plugin/advanced-custom-fields": "^4.4", for example, it is installed inside the plugins folder, as desired. The vendor prefix ('wpackagist-plugin') is important, I believe, as the packages in their search have no prefixes.
Quick solution
Try using "wpackagist-plugin/timber-library": "^1.3"
It is placed into the plugins folder nicely and comes with all its dependencies inside the plugin folder.
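For example, assuming the wpackagist repository is already declared as in the composer.json above:
composer require "wpackagist-plugin/timber-library:^1.3"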
Some more explanation
timber/timber is actually pulled from packagist.org (https://packagist.org/packages/timber/timber) instead of wpackagist.org.
On wpackagist.org you can find timber-library which is the packaged equivalent (includes composer autoloader and other deps).
This recipe for paths control (for plugin developers) says that:
To make use of it your extension's composer.json should contain:
"type" : "wordpress-plugin",
After you install your packages using composer install, you'll see that the package inside vendor/timber/timber doesn't have that type.
In fact, there's an older WordPress plugin called timber that's stuck on version 0.8. It was succeeded by timber-library.
Using "timber/timber": "^1.3" as from packagist.org
If you want to use timber as pulled from packagist.org you can do so, if you place the following line at the top of your wp-config.php:
require __DIR__ . '/wp-content/vendor/autoload.php';
Then you'll have to deploy both the vendor and the wp folder.
There's also a discussion on GitHub about how to use vendor libraries in WP projects.
Hope you get along with that info.
I've read Magento's DevDocs and Googled this problem but the usual missing registration.php answer doesn't apply here.
I'm on the released CE 2.0.0 version and I'm simply trying to enable my first minimal test module in magento/vendor/, but
bin/magento module:enable -c Tchsl_Test
results in:
Unknown module(s): 'Tchsl_Test'
I am basing this on the naming conventions and file positions of modules in vendor/magento/
In vendor/tchsl/module-test/etc/module.xml I have
<config xsi:noNamespaceSchemaLocation="urn:magento:framework:Module/etc/module.xsd">
<module name="Tchsl_Test" />
</config>
In vendor/tchsl/module-test/registration.php I have
<?php
\Magento\Framework\Component\ComponentRegistrar::register(
\Magento\Framework\Component\ComponentRegistrar::MODULE,
'Tchsl_Test',
__DIR__
);
What am I missing?
I had this same problem and, after some tinkering, discovered that our modules actually belong under app/code/Tchsl/Test/. Move your module files there, and running bin/magento module:status should show your disabled module.
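If you go that route, the move is something like the following from the Magento root (paths taken from the question; setup:upgrade is the usual follow-up after enabling a module):
mkdir -p app/code/Tchsl/Test
cp -R vendor/tchsl/module-test/. app/code/Tchsl/Test/
bin/magento module:status               # Tchsl_Test should now be listed as disabled
bin/magento module:enable Tchsl_Test
bin/magento setup:upgrade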
You don't need to put your module under app/code/. In app/code/, Magento 2 will search for and find your module's registration.php. In Composer's vendor/ dir it doesn't do that, so we need Composer itself to load your module's registration.php.
If you check any Magento 2 module's composer.json in vendor/magento/module-*, you'll see an "autoload" section that references the registration.php file. Composer will then autoload your module's registration.php, which "tells" Magento 2 where your module is located.
This is a fragment from the Magento Checkout module's composer.json:
"autoload": {
"files": [
"registration.php"
],
"psr-4": {
"Magento\\Checkout\\": ""
}
}
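Applied to the module in question, the equivalent fragment in vendor/tchsl/module-test/composer.json would presumably look like this (the Tchsl\Test PSR-4 namespace is an assumption based on the module name):
"autoload": {
  "files": [
    "registration.php"
  ],
  "psr-4": {
    "Tchsl\\Test\\": ""
  }
}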
So if you have your module in a separate repository and loaded via Composer, then that is the way to go. If you do not have it in a separate repository, then your module does not belong in vendor/ but in app/code/.
See also this Magento Stack Exchange post.
I'm using grunt-contrib-clean to try to delete the node_modules folder on the build machine at the end of a build.
But I recently added a new node module dependency, grunt-sass, which is causing the 'clean' task to fail: it is unable to delete the file node-sass\bin\win32-x64-v8-3.14\binding.node.
I think the node-sass file is still in use, because if I comment out the grunt.loadNpmTasks('grunt-sass'); shown below, then the clean works.
Q1: Can I "unload" an npm task, sort of like reversing the loadNpmTasks('grunt-sass') call?
Q2: Is cleaning up by deleting 'node_modules' at the end of the CI build the right thing™ to be doing?
Details:
Windows 7 x64
npm list
├─┬ grunt-contrib-clean@0.6.0
│ └── rimraf@2.2.8
gruntfile.js
grunt.loadNpmTasks('grunt-sass');
⋮
clean: {
  post: [
    'node_modules/grunt-sass' // was 'node_modules', but this is for debugging
  ]
}
results in error:
Running "clean:post" (clean) task
ERROR
Warning: Unable to delete "node_modules" file (EPERM, operation not permitted 'c:\myfolder\node_modules\grunt-sass\node_modules\node-sass\bin\win32-x64-v8-3.14\binding.node').
Use --force to continue.
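For what it's worth, grunt's own suggestion works as a blunt stopgap: forcing the run continues past the warning, though it leaves the locked file in place:
grunt clean:post --force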
Note:
There's a grunt-contrib-clean GitHub issue discussing this same exact issue (same file). It has some comments saying that upgrading rimraf from v2.2.7 to v2.2.8 fixed it for them. But I'm using 2.2.8 and still seeing the failure.