How to improve build times for Polymer 3.0 - polymer-3.x

We are currently developing a web application based on Polymer. Our app consists of about 250 custom elements. Up to Polymer 2, build times using Vulcanize were really fast: no more than a few minutes. We migrated the whole project to Polymer 3, and now the bundled build takes about 40 minutes. If we do an unbundled build, it is almost as fast as under Polymer 2. Unfortunately "polymer build" isn't very helpful with verbose information about what is going on... Any ideas?
I have already tried excluding all folders containing files we don't need.
We need faster build times to support frequent releases.
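For reference, here is a minimal polymer.json sketch (the entrypoint, shell, and source paths are placeholders for your project) that narrows what the bundler has to process and, when you only need to target modern browsers, skips the expensive ES5 compile step:

```json
{
  "entrypoint": "index.html",
  "shell": "src/my-app.js",
  "sources": ["src/**/*.js"],
  "builds": [
    {
      "name": "bundled",
      "bundle": true,
      "js": { "minify": true, "compile": false }
    }
  ]
}
```

Disabling `compile` is often where the bulk of the time goes, since transpiling every module to ES5 dominates large builds.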

Related

Why do Grunt/Gulp plugins take up so much space?

I have been using Prepros over the last few months to compile, minify, build etc. my projects. But I have found that it is, at times, quite clunky and not as customizable as I would like, so I am trying to find a more powerful, stable and customizable build system. I have played around with both Grunt and Gulp and love how customizable they are, but the plugins that live inside the project are massive: on some projects 70+ MB of plugins.
So how come I can't just install my most used dependencies locally, since I am always working on multiple projects and these plugin folders will start to add up over time? Also, is there a way to get the flexibility of Grunt or Gulp without having this large amount of space taken up by plugins?
So how come I can't just install my most used dependencies locally
You can just install your dependencies locally. However, if you want to reuse development dependencies across all your projects you may want to consider installing them globally.
If you were to use Node.js and the node package manager npm you would be able to do just that. You can run Gulp and Grunt effortlessly from there.
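As a sketch, a per-project package.json (the name, plugins, and versions here are illustrative) declares your build tooling as devDependencies, so any machine can restore the whole toolchain with a plain `npm install`:

```json
{
  "name": "my-project",
  "devDependencies": {
    "gulp": "^3.9.1",
    "gulp-uglify": "^2.0.0"
  }
}
```

The node_modules folder itself can stay out of version control; the manifest is what you share between projects and machines.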
Now, I am guessing that you are not really concerned about 70 MB worth of plugins, as most (if not all, in my case) are just tools that I use to build my web app.
It seems to me that the itch to have full power over your development environment has gotten under your skin. In that case welcome to the club. My recommendation would be to use Bower as your app dependencies manager and npm as your development dependencies manager.
FYI: my node_modules folder is up to 140 MB and it will probably grow as I use more and more tools. My bower_components folder is up to 43 MB. From there I use Angular and a bunch of Angular modules, Bootstrap, Font Awesome, Lodash and others. My debugging deployment size is 23 MB. That's shockingly big, right? Well, after all my optimization, minification, concatenation, obfuscation and so on, my release/dist size is 2 MB, with 1.2 MB worth of images and fonts.

Better alternative to Web Deploy Projects

I have a solution with a fair few projects, 3 of them web-based (WCF in IIS / MVC site). When the solution builds, it dumps each of the components of this distributed system in a 'Build' folder. Running the 'configurator' part of the whole output will set up the system in the cloud automatically. It's very neat :) However, the Web Deploy Projects are a major pain. They "build" (i.e. deploy) every, single, time I build - even when no changes have been made to their respective projects.
Changed a single line of code? Look forward to waiting around a minute for the 3 web projects to redeploy.
[These projects are VERY straightforward at the moment - two have a single .svc and one .ashx file - the other is an MVC app with ~5 views]
I realise I can change solution configurations to not 'build' them, but I've been doing that and it's very easy to log on the next day and forget about it, and spend a couple of hours tracking down bugs in distributed systems due to something simply having not been built.
Why do I use Web Deploy Projects? Because I need all pages + binaries from the web project. The build output for the project itself is the 'bin' folder, so no pages. The entire project folder? It has .cs, .csproj and other files I don't want included.
This will be building on build servers eventually, but it's local at the moment. But I want a quick way of getting the actual output files from the web project to my target folder. Any ideas?
Not sure if this will help in your situation (plug for my own project coming up), but I am working on a project to help ease IIS deployments:
https://github.com/twistedtwig/AutomatedDeployments
The idea being you can use config files for IIS (app Pool, applications and websites) to automate the creation and update of sites locally (dev machines) or remotely (test and production machines).
It is still a work in progress but is ready to be used in production systems.
Using package creation as a post-build step might get you closer to what you want (I don't believe it includes all the extra files), but that would still build it each time (although if the code hasn't changed, it should not rebuild unless you choose Rebuild All).
In the end I created a utility/tool which, given a project file, XCOPYs the web project's folder to a target location, then looks in said project file and deletes anything that doesn't have its Build Action set to Content. Very quick and effective.
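A rough sketch of that kind of utility in Python (not the original tool, and it takes the inverse approach: copying only the Content items plus bin/, rather than XCOPYing everything and deleting the rest):

```python
import shutil
import xml.etree.ElementTree as ET
from pathlib import Path

# Old-style .csproj files live in this MSBuild XML namespace.
MSBUILD_NS = "{http://schemas.microsoft.com/developer/msbuild/2003}"

def copy_web_output(csproj_path, target_dir):
    """Copy only items with Build Action = Content, plus bin/, to target_dir."""
    csproj = Path(csproj_path)
    project_dir = csproj.parent
    target = Path(target_dir)

    root = ET.parse(csproj).getroot()
    # Every <Content Include="..."/> item, wherever it sits in the project file.
    content_items = [el.attrib["Include"]
                     for el in root.iter(f"{MSBUILD_NS}Content")
                     if "Include" in el.attrib]

    for rel in content_items:
        rel = rel.replace("\\", "/")  # csproj paths use backslashes
        src = project_dir / rel
        dst = target / rel
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)

    # Compiled assemblies are needed too, but aren't Content items.
    bin_dir = project_dir / "bin"
    if bin_dir.is_dir():
        shutil.copytree(bin_dir, target / "bin", dirs_exist_ok=True)
```

This skips .cs, .csproj and everything else with a non-Content build action, which is the effect the original tool achieved by deleting afterwards.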
I know it is still in RC, but VS2012 does have a neat feature when publishing: it detects the changes and publishes only those. There might be something a little deeper down in the build where it does an automatic publish too.
You can take a look at the Octopus project: http://octopusdeploy.com/
Deployment is based on NuGet packages.

SWC compilation much slower than SWF

We're converting quite a large Flex project from a monolithic web application project type to multiple projects where most of the existing files (~2000) will newly live under a Library project and there will be multiple runnable projects beside it (like web version and a desktop version).
What concerns us is the compilation time. It used to be something like a minute (already more than we would have liked) but now it easily takes ~10 minutes, which is unacceptable.
Is it expected to see such a big difference when moving from the MXMLC compiler to COMPC? I know MXMLC optimizes what it compiles and will skip files that are not used in the application, so it is expected that the COMPC build would be somewhat slower since it compiles more files, but I don't think the difference should be 10-fold, should it?
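One thing worth checking (assuming the Flex SDK compilers; verify the option name against your SDK version's documentation) is incremental compilation, which can be switched on in a compiler configuration file loaded via -load-config:

```xml
<flex-config>
  <compiler>
    <!-- Recompile only what changed since the last run -->
    <incremental>true</incremental>
  </compiler>
</flex-config>
```

This won't help a clean build, but it can bring repeated library builds back toward the edit-compile cycle times you had with the monolithic project.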

Complex ASP.NET web applications and nant

Working on an intranet where we have about 20 different web apps - some .NET, some classic ASP.
Currently each .net app is its own solution. There are advantages to this - we can build & deploy just one app, without affecting other apps, and all the apps share a session - but we can't use master pages, and there are real challenges using localization resources, shared css and js, etc. Build & deployment is done completely manually, which is a real problem.
I'm trying to set up a structure that will allow us to take advantage of VS2008 features, but still have the ability to update one app without affecting the others while still using features like master pages and localization resources, and sharing session between apps (so we can't set up virtual directories for each app).
If I set up single solution that looks like:
/Root
- App_GlobalResources/
- shared
-- masterpages/
-- css/
- App1/
- App2/
...
- AppN/
..
- ClassicASP1/
then the problem is that the build just produces a single DLL (Root.dll) - this will simply not scale to 20+ apps, all of which have different development cycles.
Is it possible (using nant, or some other build tool) to build multiple DLLs? In this case, I'd like to end up with Root.dll (contains the global resources at least) and App1.dll and App2.dll.
Any other suggestions or references I should look at?
I'm not sure you can do what you want to do, sadly. VS tends to make one DLL per unique project (not solution), and it appears you have just one project, so hence, one DLL.
I'd suggest you keep one project (csproj) per application, but use NAnt to build them all (i.e. one at a time, in order) and package them all up for deployment. That way you can do a single-point deployment but still keep the apps separate.
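A rough NAnt sketch of that idea (project names are placeholders; <foreach> and <exec> are standard NAnt tasks, though msbuild needs to be on the path or referenced by full path):

```xml
<project name="intranet" default="build-all">
  <target name="build-all">
    <!-- Each app keeps its own .csproj, so each build produces its own DLL -->
    <foreach item="String" in="App1,App2,AppN" delim="," property="app">
      <exec program="msbuild">
        <arg value="${app}/${app}.csproj" />
        <arg value="/p:Configuration=Release" />
      </exec>
    </foreach>
  </target>
</project>
```

From there a packaging target can collect each app's output into the deployment folder, so one script still produces the whole deployable set.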
I'm surprised you can't use master pages in the sub-folders. You'd need to replicate them for each AppN folder, but again, NAnt could be used to pull those in from a common place when you build your deployment package.
Writing a build and deployment script takes a while to get right, but I've found that once it's done, it pays for itself very quickly - even if the only payment is your sanity!
There is a solution to this problem. In short, it entails creating a Web Site Project (which can have the masterpage and whatnot) and several subdirectories, each containing a web project. In the main web project you exclude the subdirs from the project. You then add the project files to the solution. This (updated) link tells you all about it.
-Edoode
I would advise using MSBuild instead of NAnt. It is more native to Visual Studio.

Best practices for organising structure/running "builds" on a large solution

I am making a very large web app (currently at 70 projects and 150k LOC, but with a lot more to do).
I use FinalBuilder to run build scripts. However, what are the best practices for structuring such a large project? What about build dependencies? What effect does the structure of my projects have on the performance of the code (if any)?
I've seen some previous threads about this but I can't find them now. I've seen threads about solutions exceeding 600 projects; for the sake of clear answers, let's imagine this system will grow to that size (I would like to know how to organise a project bigger than what mine will end up being, as that would also cover a smaller solution).
If it matters, the system is mostly in .NET 3.5 (C#, LINQ, SQL Server etc) but will use Python/Erlang too.
I have only 40 projects (but several million LOC), and the main best practices we follow are:
identify dependencies between projects
establish a global list of labels used by all projects wishing to participate in the next release
make sure that every project willing to publish a label of its own into this global list has built that label from a configuration (list of labels) coming from the global one
register the "official builds" (the ones potentially to be deployed into production) into a repository.
That way:
developers work and compile their code directly against the deliveries of the other projects they depend on (as opposed to downloading the sources of the other projects and rebuilding everything locally).
They only pull the right deliveries because they know about their dependencies (both immediate and transitive)
testers can quickly deploy a set of deliveries (from the global list of labels) to perform various tests (non-regression, stress tests, ...)
release management can deploy those deliveries (after a final global build) onto pre-production and production platforms
The idea is to:
not rebuild the delivery at every step
build it only at the development stage (through a common unified building script)
build it again before release (for pre-production and production platform)
compile and/or test only against those deliveries (and not against sources downloaded and re-compiled for the occasion: when you have more than a few projects, it is just not practical)
Main best-practice:
If your own project works with the deliveries of the other projects (and not with your local rebuild of those other projects), it has a good chance of working in the next steps of the software production life-cycle (test, pre-prod, production)
Have you considered using NMaven and making each of the 70 projects a module? That would allow you to control the building, packaging, versioning, and release of individual modules and the parent project as a whole. It would also help you resolve the dependencies between the different modules, external libraries, and even versions and different lifecycle scopes (for example, you only need NUnit during the testing lifecycle, and don't need to package it in the build).
It might help to explain in greater detail what these projects look like and how they depend on each other.
This is a bit of an open question. Let's start with a basic structure I suggest as a starting point to my customers; inside a branch I have:
Build Scripts
Build Dependencies - things to install on a build machine
Libraries - LIB, DLL, ... directly referenced from projects
Documentation Sources - help sources
Sources
Deploy Scripts
then Sources is organized in
Admin - admin console and scripts
Database - schemas, scripts, initial data
Common
Web
NTServices
Services - REST/SOAP services
BizTalk - and so on; whatever is specific to your product
