Why do Grunt/Gulp plugins take up so much space? - gruntjs

I have been using Prepros over the last few months to compile, minify, build, etc. my projects. But I have found that it is, at times, quite clunky, or isn't as customizable as I would like. So I am trying to find a more powerful, stable and customizable build system. I have played around with both Grunt and Gulp and love how customizable they are, but the plugins that live inside each project are massive, amounting to 70+ MB of plugins on some projects.
So how come I can't just install my most used dependencies locally? I am always working on multiple projects, and these plugin folders will start to add up over time. Also, is there a way to have the flexibility of Grunt or Gulp without this large amount of space being taken up by plugins?

So how come I can't just install my most used dependencies locally
You can just install your dependencies locally. However, if you want to reuse development dependencies across all your projects, you may want to consider installing them globally.
If you were to use Node.js and the Node package manager, npm, you would be able to do just that. You can run Gulp and Grunt effortlessly from there.
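For example, with npm that looks roughly like this (a sketch; the CLI and plugin names are just common examples):

    # install the task runners' command-line tools once, globally
    npm install -g gulp-cli grunt-cli

    # inside each project, plugins are installed locally as dev dependencies
    npm install --save-dev gulp gulp-uglify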
Now, I am guessing that 70 MB worth of plugins is not something you really need to be concerned about, as most, if not all (in my case, at least), are just tools used to build the web app; they are not part of what you ship.
It seems to me that the itch to have full power over your development environment has gotten under your skin. In that case, welcome to the club. My recommendation would be to use Bower as your app dependency manager and npm as your development dependency manager.
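Concretely, that split means package.json (npm) carries the build tooling while bower.json (Bower) carries the app libraries. A minimal sketch, with illustrative names and versions:

    package.json:
    {
        "devDependencies": {
            "gulp": "^3.9.0",
            "gulp-uglify": "^1.5.0"
        }
    }

    bower.json:
    {
        "dependencies": {
            "bootstrap": "~3.3.0",
            "lodash": "~3.10.0"
        }
    }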
FYI: my node_modules folder is up to 140 MB, and it will probably grow as I use more and more tools. My bower_components folder is up to 43 MB. From there I use Angular and a bunch of Angular modules, Bootstrap, Font Awesome, Lodash and others. My debugging deployment size is 23 MB. That's shockingly big, right? Well, after all my optimization, minification, concatenation, obfuscation and so on, my release/dist size is 2 MB, with 1.2 MB worth of images and fonts.

Related

ASP.NET - install React and TypeScript

So I have created an ASP.NET 4.5.2 project and now need to install React and TypeScript. I have installed Node.js, so I am wondering if it is best to install them via that. Also, because I will be using TypeScript, I will need the .d.ts files; is there an easy way to install these locally in the project? I assume everything else will be installed globally by npm, as I might use the tools in other projects.
One other thing: I am confused by all the different React packages available on npm. Do I need a few, or just one of them? I have worked on many projects involving this kind of tech stack, but they were already established; I have never created one from scratch like I am doing now. So some really informative links or tips would be immensely helpful! :)
Using Visual Studio 2017, I followed this tutorial and managed to get it working. The only issue left now is that I need to run the webpack command in the project root whenever I make changes, before refreshing the site. I am fine with this and will look further into automating it, as that is really a different and unrelated question.
One tip I will include: install npm packages globally (most of the time, anyway) and just link them into the project using npm link. This was quite useful, considering I went through the process several times, creating the project from scratch over and over until I understood it all.
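As a rough sketch of that workflow (webpack and typescript stand in for whatever packages you actually use):

    # install once, globally
    npm install -g webpack typescript

    # inside the project root, symlink the global installs into node_modules
    npm link webpack
    npm link typescript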

Why does Meteor use its own package manager, Atmosphere, and not npm?

Since Meteor is pure JavaScript, why does it not use CommonJS modules or npm packages, but rather introduces its own system called Atmosphere?
While it is true that Meteor is pure JavaScript (JS), a ton of that JS is custom-developed to operate in the Meteor framework, so it makes sense (to the Meteor developer team) to have a website that provides a catalog (and API) for Meteor-specific JS libraries.
It's perfectly fine to use npm while developing a Meteor app, but there is a Meteor-specific ecosystem around the JS libraries in Atmosphere that makes it easier for developers to find Meteor-specific JS libraries.
It is quite common for packaging systems to be created for a specific development environment/purpose not only for technical reasons, but also (and sometimes mostly) for social reasons.
For example, jar files are really zip files, but having a distinct suffix (and "type") helps Java apps and developers recognize their own packaging format. Similarly, Debian .deb files are packages specifically for Debian Linux, while CentOS/Red Hat use RPMs as a packaging format -- even though the contents are effectively identical.
So, Atmosphere is a website for cataloging & delivering meteor-specific JS libraries and apps, for technical reasons as well as a marketing tool to increase awareness of meteor's ecosystem.
There's also a Meteor package (and maybe a few others out there too) that allows you to add npm packages and make use of them in your Meteor app.
https://github.com/meteorhacks/npm
I also found this on npm. Basically the same thing, it seems:
https://www.npmjs.com/package/meteor-npm
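For illustration, usage of meteorhacks:npm looked roughly like this (a sketch; check the package's README for the exact details):

    // after running: meteor add meteorhacks:npm
    // a packages.json file at the project root declares the npm dependencies, e.g.
    //   { "redis": "0.8.2" }

    // the declared package can then be loaded in server-side code:
    var redis = Meteor.npmRequire('redis');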

Best way for deployments with builds, dependency managers and Git? [closed]

I'm currently working with Git, Git Flow, Gulp and Bower. I work on the develop branch and create releases to the master branch with Git Flow. So develop corresponds to my local and test environments, release/version to the acceptance environment, and the master branch to the production environment. See: http://nvie.com/files/Git-branching-model.pdf. Every environment runs Git, so deploying is nothing more than: git pull origin master.
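For reference, that release flow maps onto commands like these (a sketch using the git-flow extension; the version number is made up):

    git flow release start 1.2.0    # branches release/1.2.0 off develop
    git flow release finish 1.2.0   # merges into master and tags the release

    # then, on the target environment:
    git pull origin master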
I've got some dependencies like Bootstrap and Font Awesome handled with Bower, and I'm using Gulp to watch .less files for "compiling" to CSS, minifying CSS/JS, etc.
Now on to the questions: what should be in my Git repository? Let's say I'm working on a Magento project; it would be overkill to put Magento and all the dependencies from Bower in the repository. Currently I'm excluding only the node_modules (for Gulp) and bower_components (the dependencies) directories. When I run Gulp, the .less files from Bootstrap are "merged" and "compiled" together with my project-related .less files. The "compiled" .css file is currently included in my repository, otherwise it's not possible to deploy with just git pull. To get it working without having the "compiled" version in Git, I'd have to run Gulp on the production server.
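For reference, those exclusions boil down to a .gitignore along these lines:

    # dependency directories installed by npm and Bower
    node_modules/
    bower_components/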
What is the best method for not having my platform (Magento or WordPress) in my Git repository, while keeping the possibility to update easily? I came across this solution: http://blog.g-design.net/post/60019471157/managing-and-deploying-wordpress-with-git where they're using Git submodules. A nice solution, but this way the platform needs to be in a subdirectory. Not ideal, because I have to "hack" to get it working that way (copy the index.php and change the include paths, etc.).
What about plugins/modules? Third-party plugins shouldn't be in my repository either, right? Only the plugins I've created myself. But not all third-party plugins have a Git repository to use with, for example, Git submodules. For WordPress a plugin is just one directory, so in theory it's possible, but for Magento most plugins aren't just one directory (they have files in the app/code, app/design, skin, etc. directories). I have a lot of WordPress and Magento sites with multiple matching plugins/modules, and currently every plugin/module is duplicated in each site's repository.
Should "compiled" files be in a repository? If you ask me: no, but I'm currently doing it so it's easy deploying. Is it general to have Bower and run Gulp on a production server for dependencies and to "compile" on the production server right after a git pull? Continues running a Gulp watcher (like I do locally) on production takes some extra unnecessary resources?
I hope someone can put me in the right direction.
It's difficult to provide a universal answer that works for Magento, WordPress, and other platforms. My answer primarily targets Magento, but I'm sure that similar concepts could be applied to WordPress and other platforms.
It's possible to automatically install Magento as a dependency using Composer with magento-composer-installer. You can either use a public mirror, like this one, or set up your own repository. Once you've installed Magento itself as a dependency, it should be very straightforward to update it, just like any other dependency.
You can use Composer for modules as well (after all, Magento itself is just a bunch of modules and a few scripts to glue everything together). FireGento has a lot of common Magento modules which can be used with Composer out of the box. You can also set up your own repositories to use with Composer (we do this for internal modules that we use for multiple sites).
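A minimal composer.json illustrating both ideas might look like this (a sketch; the package names, repository URL and root directory are illustrative, so check the respective documentation):

    {
        "require": {
            "magento-hackathon/magento-composer-installer": "*",
            "firegento/magesetup": "*"
        },
        "repositories": [
            { "type": "composer", "url": "https://packages.firegento.com" }
        ],
        "extra": {
            "magento-root-dir": "htdocs/"
        }
    }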
As for modules that don't have their own repositories, well, you have three options: don't use them (the fewer modules, the better), create your own repositories for them, or just commit them along with the rest of your codebase.
In an ideal world, your repository should only consist of your own source files. Everything else - whether it's compiled, installed through a dependency manager, or otherwise somehow automatically created - should not be in the repository.
But we don't live in an ideal world - it's painful to run Composer, Bower, Git, Gulp, Sass, and everything else that you're using on each and every environment that you want to deploy to (development machines, testing server(s), staging server(s), production server(s), and so on).
And what if those dependencies have dependencies? What if you're using a Gulp plugin that works well on one of your servers but fails on another? What if someone makes a deployment and forgets to run some of the required Gulp builds? Of course, the answer to these questions is testing and automation. You should be able to click a button (or type a command, for the purists) and have everything deploy automatically: a git pull is issued, the appropriate gulp commands are run, and so on. But unless you have the manpower and man-hours to set up such a sophisticated system, it's not worth it.
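Even a crude script captures the one-button idea (a minimal sketch; swap in whatever your toolchain actually requires):

    #!/bin/sh
    # naive one-command deploy: fetch code, install dependencies, build assets
    set -e
    git pull origin master
    composer install --no-dev
    npm install
    bower install
    gulp build    # assumes the gulpfile defines a "build" task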
Overall, we use a combination of the different points above. In the end, we end up committing (almost) everything to the repository - resulting in deployments as simple as git pull or svn up. Of course, configuration files (credentials, .htaccess files, and so on) don't go in the repository, and neither do fingerprinted files (which we manually upload).
Fabrizio Branca has an excellent blog post here which goes through many of the described points in order to clean up and upgrade a Magento setup.

Is it a good idea to hook up Composer to manage web assets in Symfony2?

As you know, in Symfony 2.1 PHP bundles and packages are managed by Composer; would it maybe be a good idea to hook up the management of web assets as well? I would really love to update Twitter Bootstrap, jQuery, jQuery UI, Underscore.js and many other libraries using the same console command I use to update the PHP packages.
Are there any serious downsides of doing this?
Well, it sounds like a great idea, but I don't think it would be possible:
Composer was created for handling PHP dependencies, not front-end dependencies; the Twitter team created Bower for those.
Combining those two great tools is a huge task: you would need to create your own Composer commands and configuration files.
Bower puts everything in a components directory. This isn't the correct directory for web assets, so you would need to change it. You can't change this in the Bower config, as far as I know about Bower, which is almost nothing. UPDATE: As noted by @xanido, you can configure the output directory with the directory option as of Bower 0.3.0.
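That option lives in a .bowerrc file next to your bower.json; the target path below is just an example:

    {
        "directory": "web/assets/vendor"
    }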
So, you can manage web assets in Symfony2 with Bower (and maybe other programs like it), but combining the two isn't good practice. Using Bower and Composer separately can be useful, although you end up with a separate web assets directory.

Best practices for organising structure/running "builds" on a large solution

I am making a very large web app (currently at 70 projects and 150k LOC, but with a lot more to do).
I use FinalBuilder to run build scripts. However, what are the best practices for structuring such a large project? What about build dependencies? What effect does the structure of my projects have on the performance of the code (if any)?
I've seen some previous threads about this but can't find them now. I've seen threads about solutions exceeding 600 projects; for the sake of clear answers, let's imagine this system will grow to that size (I would like to know how to organise a project bigger than mine will end up being, as that would mean I can organise a smaller solution).
If it matters, the system is mostly in .NET 3.5 (C#, LINQ, SQL Server etc) but will use Python/Erlang too.
I have only 40 projects (but several million LOC), and the main best practices we follow are:
identify dependencies between projects
establish a global list of labels used by all projects wishing to participate in the next release
make sure that every project willing to publish a label of its own into this global list has made that label from a configuration (list of labels) coming from the global one
register the "officials builds" (the one potentially to be deployed into production) into a repository.
That way:
developers work and compile their code directly against the deliveries of the other projects they depend on (as opposed to downloading the sources of the other projects and rebuilding everything locally)
they only have the right deliveries because they know about their dependencies (both immediate and transitive)
testers can quickly deploy a set of deliveries (from the global list of labels) to perform various tests (non-regression, stress tests, ...)
release management can deploy those deliveries (after a final global build) onto pre-production and production platforms
The idea is to:
not rebuild the delivery at every step
build it only at the development stage (through a common unified build script)
build it again before release (for the pre-production and production platforms)
compile and/or test only against those deliveries (and not against sources downloaded and recompiled for the occasion: when you have more than a few projects, that is just not practical)
Main best-practice:
If your own project works with the deliveries of the other projects (and not with your local rebuild of those other projects), it has a good chance of working in the next steps of the software production life cycle (test, pre-prod, production).
Have you considered using NMaven and making each of the 70 projects a module? That would allow you to control the building, packaging, versioning, and release of individual modules and of the parent project as a whole. It would also help you resolve the dependencies between the different modules, external libraries, and even versions and different lifecycle scopes (for example, you only need NUnit during the testing lifecycle, but don't need to package it in the build).
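For a rough idea of the shape of such a setup, here is a generic Maven-style parent descriptor (illustrative only; NMaven's actual configuration may differ):

    <project>
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.example.bigapp</groupId>
        <artifactId>bigapp-parent</artifactId>
        <version>1.0.0</version>
        <packaging>pom</packaging>
        <!-- each project in the solution becomes a module,
             built and versioned together by the parent -->
        <modules>
            <module>Common</module>
            <module>Web</module>
            <module>Services</module>
        </modules>
    </project>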
It might help to explain in greater detail what these projects look like and how they depend on each other.
A bit open as a question. Let's start with a basic structure I suggest as a starting point to my customers; inside a branch I have:
Build Scripts
Build Dependencies - things to install on a build machine
Libraries - LIB, DLL, ... directly referenced from projects
Documentation Sources - help sources
Sources
Deploy Scripts
then Sources is organized into:
Admin - admin console and scripts
Database - schemas, scripts, initial data
Common
Web
NTServices
Services - REST/SOAP services
BizTalk - you name it, whatever is specific to a product
