Is it possible to run ILMerge at compile time within SharpDevelop? - build-process

I'd like to offer my .Net library (which I'm developing in the SharpDevelop IDE) as a single dll. I've been manually using ILMerge to merge my compiled library and all its reference libraries together, but would like this done automatically.
I'd ideally like to have this automatic merge happen from within SharpDevelop, without having to set up an external build script. Is this possible?

SharpDevelop uses MSBuild to compile your code, so the simplest approach is to create a post-build step that runs ILMerge with the correct parameters. You can create a post-build step from Project Options under the Build Events tab. Alternatively, you can edit your project file directly in Notepad.
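For example, the post-build event in the project file might look roughly like this (the ILMerge install path and the dependency names are placeholders for whatever your library actually references):
<PropertyGroup>
  <PostBuildEvent>"C:\Program Files\Microsoft\ILMerge\ILMerge.exe" /out:"$(TargetDir)$(TargetName).Merged.dll" "$(TargetPath)" "$(TargetDir)SomeDependency.dll"</PostBuildEvent>
</PropertyGroup>
The same command line can be pasted straight into the Build Events tab instead of editing the project file by hand.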

Related

Location of NAnt build file in Visual Studio solution

I am just starting to use NAnt and have a quick question -
When using NAnt with an ASP.NET web application, where is the recommended place to put the build file? Should it be part of the web project, or should it sit directly under the solution? Am I overthinking this?
I recommend placing your build file alongside the rest of the project. You should also create a separate directory for it (e.g. named Build), because in the future you will need to create another build (or .include) file, and then another, and eventually you will have a set of build files that should be kept together.
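For example, with the build files grouped under a Build directory next to the solution, the top-level file can pull the others in with NAnt's <include> task (the file names here are only illustrative):
<project name="MyWebApp" default="build">
  <!-- shared properties and targets kept in separate files under Build/ -->
  <include buildfile="common.include" />
  <include buildfile="deploy.include" />
  <target name="build">
    <!-- compile, test, package ... -->
  </target>
</project>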

VS2010 Automatically rebuild minified .js/.css files

Problem:
I have been trying to integrate minification of JavaScript and CSS files into our VS2010 (.NET 4.0) projects. From what I hear, .NET 4.5 and VS2012 will have minification built into the editor, so it will be as easy as setting a flag. Unfortunately we are still on VS2010 (.NET 4.0).
Let me explain what I want to do and what I don't want to do.
I don't want a big setup with classes/config file(s)/etc. just to minify, because all of that would have to be loaded onto our build machine, and even the build XML files might have to be modified to make it work. Also, once we move to VS2012 and .NET 4.5, all of those configs/classes/etc. would have to be discarded because VS2012 will have the functionality built in.
Here is what I think might be the best option. Since I am using the ScriptManager and it can already pull in either a .debug.js (non-minified) or a .js (minified) script based on the build type, it seems all I need is some sort of (pre?) build event that rebuilds a non-minified .js file into a minified one. Obviously the build event will have to call a minification module installed on the local computer (the YUI Compressor seems very nice). The module would update the minified .js file.
I have been reading about this, but I am getting a little lost. There are a lot of third-party tools with a bunch of setup and classes that I do not want to add.
Has anyone done something similar to what I described above?
If not, what is the next best simple solution?
(By the way, if you are going to say move to VS2012/4.5, that's not a solution for us at this point.)
Solution:
Thank you Parv Sharma for your answer.
I would just like to explain what I did so that it may help someone in the future.
I installed the Microsoft Ajax Minifier.
Created a batch file to add the minifier to the PATH environment variable: setx path "%PATH%;C:\Program Files\Microsoft\Microsoft Ajax Minifier" /m
Added the following pre-build event to my project:
ajaxmin "$(ProjectDir)Script.js" -out "$(ProjectDir)Script.min.js" -clobber
If Script.min.js does not exist, the build event will create it, but it will not be added to the project (I'm not sure how to do that through the events).
When you add a new script file, mynewscript.js, just create a second blank file called mynewscript.min.js and add a pre-build event for it.
Using this approach, the only thing you have to do on the build machine is run the Microsoft Ajax Minifier setup package and the batch file. That's it; everything else is part of your pre-build events.
What you are looking for is probably this:
http://ajaxmin.codeplex.com/documentation
Using it, you can make this third-party tool your minifier.
After downloading the tool you have two options:
1. Edit the MSBuild project file to minify the .js files as part of a build event (a rough sketch of this follows below), or
2. Attach the tool to Visual Studio and assign a key combination to it. That way you can minify whenever you want, just like pressing F5 or Ctrl-Shift-B.
Attaching it to Visual Studio is easy: just go to External Tools in the Tools menu and add the tool with the required parameters.
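As a sketch of option 1, assuming ajaxmin.exe is already on the PATH (as in the batch file above) and the scripts live under a Scripts folder, a target added near the bottom of the project file could minify every script that is not itself already a .min.js file:
<ItemGroup>
  <ScriptsToMinify Include="Scripts\**\*.js" Exclude="Scripts\**\*.min.js" />
</ItemGroup>
<Target Name="BeforeBuild">
  <!-- one ajaxmin call per matching script, writing Foo.min.js next to Foo.js -->
  <Exec Command="ajaxmin &quot;%(ScriptsToMinify.FullPath)&quot; -out &quot;$(ProjectDir)%(ScriptsToMinify.RelativeDir)%(ScriptsToMinify.Filename).min.js&quot; -clobber" />
</Target>
The item group and folder name are assumptions; adjust them to wherever your scripts actually live.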

MSBuild - Perform all config transforms at once - and transform other files

I have been working on a build script for a website that we have. The website is a classic ASP web app with an ASP.NET website in a folder of the classic ASP app.
We have different versions of the global.asa that need to be substituted in, rather than different web.configs. We are in the process of moving to a continuous integration environment, so much of this is still new to me. I've written a build script that performs the following tasks.
Cleans the buildartifacts directory if it exists.
Builds the solution file with whatever configuration is passed in. This produces output with each project in a separate folder.
Copies the files into the required folder structure.
Packages up the result using MSDeploy as sync.
My first problem is this ...
When I run the MSBuild task like so ...
<MSBuild Projects="$(SolutionFileName)"
Properties="Configuration=$(Configuration);OutDir=%(BuildArtifacts.FullPath)" />
It builds the web app but does not apply any transformations. I would have assumed that MSBuild would apply the transforms automatically; instead I end up with all three config files in the output folder that contains the build. Why is this? I've done some searching here and here, and people are using a separate task to perform the transformation. If Visual Studio can apply the transforms, and Visual Studio uses MSBuild, I would think that MSBuild could apply the transformations. Isn't MSBuild configuration-aware? Also, if I do have to do it separately, can I perform all the transformations at once when there are multiple config files in multiple folders at each level of the folder structure?
My second problem is this: being a classic ASP web app, we can't really use config files for that part of it, because... well, I'm not sure how the classic ASP code would access a config file. So we have different versions of the global.asa file that would normally get swapped in manually. I suppose I could do some sort of search/copy of the specific .asa file we require at the time, but is there a way to use transformations for this task instead?
Maybe this is not exactly what you need, but I'm using the XmlPreprocess tool for config file manipulation. It uses one mapping file for multiple environments, and you can edit the mapping file in Excel. It is very easy to use.
You can call it from an MSBuild script using the Exec task.
Regarding the issue of transformations: once we sorted out deployment with MSDeploy, we found that MSDeploy will actually perform the transformations on deployment. It stores the transformation data in one of the XML files that gets created with the package.
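If you do want to run the transforms yourself from the build script, the TransformXml task that ships with Visual Studio 2010's web publishing targets can be invoked directly, and the same configuration-keyed idea covers the global.asa swap with a plain Copy. The assembly path and file names below are just the usual defaults, so adjust them to your layout:
<UsingTask TaskName="TransformXml"
           AssemblyFile="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\Web\Microsoft.Web.Publishing.Tasks.dll" />
<Target Name="TransformConfigs">
  <!-- layer Web.$(Configuration).config on top of Web.config -->
  <TransformXml Source="Web.config"
                Transform="Web.$(Configuration).config"
                Destination="$(OutDir)Web.config" />
  <!-- pick the matching global.asa variant, e.g. global.Release.asa -->
  <Copy SourceFiles="global.$(Configuration).asa"
        DestinationFiles="$(OutDir)global.asa" />
</Target>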

How do you enforce dependencies among java folders in Netbeans?

I am new to NetBeans and am wondering if someone can help me with project setup. I am moving half a million lines of Java code from a different IDE to NetBeans. I was able to get the code to build and run in NetBeans easily. I have a project with many folders, with dependencies among those folders, and they have to be built in a specific order. This enforces layering, so that a module in a lower layer cannot call into higher layers. I couldn't get that configured in NetBeans. Below is what my project looks like:
project/
libA/
libB/
libC/
libD/
libE/
appA/
...
I have one project that builds all the libs and appA. The project build XML is stored under the project/ folder, but the libs have dependencies among them: libB should be built after libA, libC after libA, libE depends on libD and libB, and so on.
I tried changing the order of the source folders for the libs in the project properties, but that didn't seem to make any difference. Even when I moved libA after libB, everything still built fine. I expected it to fail because libA had not been built yet.
I am lost. I'm just wondering what the trick is to enforce this kind of dependency. I created my project using the "Java project using existing sources" wizard.
I appreciate your help
Thanks
Video guy.
Even though it would be a pain, you could just write your own Ant build script and then have NetBeans use that.
Basically:
- write the custom Ant build file
- install the Ant plugin
- create an Ant build file
- right-click the build file
- run the selected target
This would enable you to enforce whatever you need, but if NetBeans is figuring out the correct order, then why not just use it?
Does something break when you just compile and run in NetBeans?
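A stripped-down sketch of such a build file, assuming each lib keeps its sources directly under its own folder (adjust the depends attributes to your real layering), could be:
<project name="layered" default="all" basedir=".">
  <target name="libA">
    <mkdir dir="build/libA"/>
    <javac srcdir="libA" destdir="build/libA"/>
  </target>
  <target name="libB" depends="libA">
    <mkdir dir="build/libB"/>
    <!-- libB may only see classes from libA -->
    <javac srcdir="libB" destdir="build/libB" classpath="build/libA"/>
  </target>
  <target name="libD">
    <mkdir dir="build/libD"/>
    <javac srcdir="libD" destdir="build/libD"/>
  </target>
  <target name="libE" depends="libB,libD">
    <mkdir dir="build/libE"/>
    <javac srcdir="libE" destdir="build/libE" classpath="build/libA;build/libB;build/libD"/>
  </target>
  <!-- libC, appA, etc. follow the same pattern -->
  <target name="all" depends="libE"/>
</project>
Because each javac only has the lower layers on its classpath, a call from a lower layer into a higher one fails to compile instead of slipping through a single all-in-one javac run.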
Well, let's say a team member adds a piece of code in a lower-level package that calls into higher-layer code. The build should fail because it breaks the layering. Because NetBeans seems to compile all the files in one javac invocation, the build compiles just fine. I want NetBeans to break the build in this case.
Writing my own Ant script is another way of enforcing it, but the whole point of using an IDE is to save yourself from writing your own make files (or Ant scripts). This is something any IDE could do out of the box 10 years ago. I am wondering if I am missing something here.
Thanks
Video Guy

Best way to manage generated code in an automated build?

In my automated NAnt build we have a step that generates a lot of code from the database (using SubSonic), and the code is separated into folders that match the schema names in the database. For example:
/generated-code
/dbo
SomeTable.cs
OtherTable.cs
/abc
Customer.cs
Order.cs
The schema names are there to isolate the generated classes that each app will need. For example, there is an ABC app that will pull in the generated code from this central folder. I'm doing that in a pre-build event, like this:
del /F /Q $(ProjectDir)Entities\generated\*.cs
copy $(ProjectDir)..\..\generated-code\abc\*.cs $(ProjectDir)Entities\generated\*.cs
So on every build, the NAnt script runs the generator, which puts all the code into a central holding place, and then it kicks off the solution build... which includes the pre-build events for each of the projects that need their generated classes.
So here's the friction I'm seeing:
1) Each new app needs to set up this pre-build event. It kind of sucks to have to do this.
2) On our build server we don't generate code, so I actually have an IF $(ConfigurationName) == "Debug" before each of those commands so it doesn't happen for release builds.
3) Sometimes the commands fail, which fails our local build. It will fail if:
- there is no generated code yet (just setting up a new project, no database yet)
- there is no existing code in the directory (first build)
Usually these are minor fixes, and we've just hacked our way to getting a new project or a new machine up and running with the build, but it's keeping me from my one-click-build Nirvana.
So I'd like to hear suggestions on how to make this a bit more durable. Maybe move the copying of the code into the application folders into the NAnt script? This seems kind of backwards to me, but I'm willing to listen to arguments for it.
OK, fire away :)
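Just to make that last idea concrete, the per-project pre-build events could in principle be replaced by something like this in the NAnt script (the app folder name is made up, and the exists check is only there so a missing generated folder doesn't fail the build):
<target name="copy-generated">
  <!-- skip quietly if the generator has not produced anything yet -->
  <if test="${directory::exists('generated-code/abc')}">
    <copy todir="AbcApp/Entities/generated" overwrite="true">
      <fileset basedir="generated-code/abc">
        <include name="*.cs" />
      </fileset>
    </copy>
  </if>
</target>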
How often does your DB schema change? Wouldn't it be possible to generate the database-related files on demand (e.g. when the schema changes) and then check them into your code repository?
If your database schema doesn't change, you can also package the compiled *.cs classes and distribute the archive to other projects.
We have two projects in our solution that are built completely out of generated code. Basically, we run the code generator .exe as a post-build step for another project, and along with generating the code it automates the active instance of Visual Studio to make sure that the generated project is in the solution, that it has all of the generated code files, and that they are checked out/added to TFS as necessary.
It very rarely flakes out during the VS automation stage and forces us to run it "by hand", and that's usually only if you have several instances of VS open with more than one instance of the solution open and it can't figure out which one it's supposed to automate.
Our solution and process are such that the generation should always be done and correct before our auto-build gets to it, so this approach might not work for you.
Yeah, I'd like to take VS out of the equation so that a build from VS is simply compiling the code and references.
I can manage the NAnt script... I'm just wondering whether people have advice about having one NAnt script, or possibly one per project, which can push the code into the projects rather than having it pulled.
This does mean that you have to opt in to generating code.
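Something along these lines in the NAnt script would make generation opt-in (the property, program name and arguments are placeholders for however you actually invoke SubSonic):
<property name="generate.code" value="false" overwrite="false" />
<!-- run only when invoked as: nant generate -D:generate.code=true -->
<target name="generate" if="${generate.code}">
  <exec program="tools\sonic.exe">
    <arg value="generate" />
  </exec>
</target>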
