How to serve new DLLs directly from NuGet after each CI Build? - asp.net

I hope this is a stupid duplicate of an already-answered question.
I have an ASP.NET website that depends on some other projects (DLLs copied to bin). Now, what I want is that every time any of those projects is updated, I get the latest DLLs in my website/bin. I DO NOT want my CI server to check in updated DLLs.
I already have a private NuGet feed for my project, and I just want it to serve the latest DLLs after each successful CI build. Now, my questions are:
Is there a way to directly serve the DLLs, without creating a nupkg - and perhaps pick them up from the build output folder? (For various reasons, it's not convenient to create a package as a post-build task for all the DLLs hundreds of times a day.) If that is possible, awesome!
If not, can we avoid increasing the version number of the DLLs each time and still make NuGet update to the new DLLs? Something like updating based on the latest publish date? (There is a huge bunch of DLLs, and a lot of dependencies.)
Is there a way to get the latest DLLs without building the solution? Yes, I can run a nuget update command, but is there any other way?
Someone suggested mirroring my current code base and using something like MyGet or ProGet. For several reasons, that is not feasible at the moment.

Triggering a Visual Studio build whenever any NuGet dependency changes is probably not quite what you really need - that's a job for CI. However, you can set version ranges in your packages.config file to make VS (via NuGet) pull newer NuGet packages when available.
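For example, a minimal sketch of such a packages.config entry (the package id and range are illustrative; the allowedVersions attribute constrains which versions nuget update is allowed to move to):

    <?xml version="1.0" encoding="utf-8"?>
    <packages>
      <!-- nuget update may move this package anywhere within [1.0, 2.0) -->
      <package id="MyCompany.SharedLib" version="1.2.4" allowedVersions="[1.0,2.0)" />
    </packages>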
To answer your specific points:
Why would you want to serve 'random' loose DLLs whose origin you cannot be certain of? NuGet provides a mechanism to track the origin of code on which your own code depends, which makes tracking down bugs easier :) If you rely on NuGet packages containing DLLs which change hundreds of times a day, then you should likely just build those DLLs directly with your application.
See #1 - if you are re-building NuGet packages very often, then you likely have your package boundaries wrong. Consider how truly independent your packages are, and see if it would make sense to bring some of the DLLs together, or even to separate out (fork) code which is shared between multiple separate applications. If you create a new version of a NuGet package, then you should increase the version number - that's a fundamental premise of semantic versioning, and you'll get into a mess if you do not follow this pattern.
To bring down the latest NuGet dependencies, nuget update is your friend :)
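A hedged sketch of what that looks like against a private feed (the solution name, package id, and feed URL are all illustrative):

    nuget update MySolution.sln -Id MyCompany.SharedLib -Source https://nuget.example.com/feed

Run without -Id, nuget update will attempt to update every package in the solution, subject to any allowedVersions ranges in packages.config.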
Using MyGet or ProGet might be part of a solution, but it's not directly related to the patterns you mention above.

Related

MSIX: How to achieve automatic install of .net 5 required for my application?

I wrote a WPF program using .NET 5 and packed it into an MSIX bundle (Release, x86 and x64) as a framework-dependent package. Everything seems fine, but there is one very annoying thing: on the first run the app says ".NET runtime is missing, would you like to install it?" If you click yes, the download page opens, where the user has to select the needed runtime, download it, and install it. Not the best user experience; I'm thinking about how to improve it.
Is there an option to add the .NET 5 runtime (x86 or x64 depending on the user's system, or maybe both) as a dependency so it is installed automatically?
I know I can define dependencies, but how can I find the right name for the needed dependency?
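For reference, package dependencies are declared in the Package.appxmanifest like this (the VCLibs entry is just a common illustrative example - the whole question is what Name to use for the .NET 5 runtime):

    <Dependencies>
      <TargetDeviceFamily Name="Windows.Desktop" MinVersion="10.0.17763.0" MaxVersionTested="10.0.19041.0" />
      <PackageDependency Name="Microsoft.VCLibs.140.00" MinVersion="14.0.24217.0"
                         Publisher="CN=Microsoft Corporation, O=Microsoft Corporation, L=Redmond, S=Washington, C=US" />
    </Dependencies>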
Also, I know it's possible to define a custom install action, but I haven't tried it yet, because I want to find a simpler solution. Looks like for that option I'll have to create a small app or script that checks whether the needed runtime exists and, if not, checks the platform and asks the user to install the specific version of the runtime. Not the best user experience either.
Of course, I still have the option to go with self-contained, but I don't want to distribute so many megabytes of .NET every time, especially given that I expect frequent updates.
Luckily, I got an answer on techcommunity.microsoft.com.
Thanks to Matteo Pagani:
If it's an application based on .NET Core / .NET 5 (as I seem to understand from the description), the suggested and best way to distribute it via MSIX is to use the self-deployment approach. Thanks to MSIX features like differential updates and single disk instance, you don't have to worry too much about the increased size, since the runtime will be downloaded only at the first install.
Dependencies are not a good fit because there are no packages for .NET 5 yet.
Custom install actions are possible but more complicated, so I decided to go with self-contained.
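For anyone else who lands here: with .NET 5, a self-contained build for each platform can be produced like this before packaging (the RIDs shown are illustrative):

    dotnet publish -c Release -r win-x64 --self-contained true
    dotnet publish -c Release -r win-x86 --self-contained true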

Should project.lock.json file be checked into source control? (ASP.NET Core 1.0)

Using ASP.NET Core 1.0, is it best practice to check in the project.lock.json file into source control?
Short answer: No, project.lock.json should not be checked into source control - you should configure the version control system to ignore it (i.e. add it to .gitignore if you're using git).
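For instance, a one-line ignore entry (assuming git):

    # NuGet restore output - regenerated by dotnet restore
    project.lock.json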
Long answer: project.lock.json contains a snapshot of the project's whole dependency tree - not just the packages listed in the "dependencies" sections, but also all resolved dependencies of those dependencies, and so on. But it is not like Ruby's Gemfile.lock. Unlike Gemfile.lock, project.lock.json doesn't tell dotnet restore which exact versions of packages should be restored - it simply gets overwritten. As such, it should be treated like a cache file and never be checked into source control.
If you check it into version control, then most probably, on another machine:
dotnet will think that all packages are restored, but in fact some packages might be missing and the build will fail, without prompting the developer to run dotnet restore
project.lock.json will be overwritten during dotnet restore and will, in most cases, differ from the version stored in source control, so it will be modified in almost every commit
project.lock.json will cause conflicts during merge
Actually you do want to commit your project.lock.json in git sometimes.
Checking in your project.lock.json
For the exact reason that it shows you the dependencies you actually used. Say:
1. I, as a developer, work on an application. I hate updating packages all the time, so I add a dependency on NuGet package X = 1.*.
2. I restore packages and get version 1.2.4.
3. The package maker just made a very stupid mistake; he broke something while just trying to make a fix, and releases 1.2.5.
4. Person 2 checks out (or even worse, a release build kicks in).
5. Person 2 restores and gets version 1.2.5.
6. Person 2 runs your application and finds the application is broken.
7. Person 2 starts debugging and thinks there must be a bug in the software.
At step 7, Person 2 could have seen in git that his lock file had changed and a newer version of a library had been downloaded - which had not been tested by any of the other developers!
Downsides
The downside of checking in this file is that you might get a lot of merge conflicts on continuous updates of packages.
Alternative solution
Use only hard version dependencies (this is quite hard to do with NuGet, though), and only manually update to a newer version once in a while.
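In project.json terms, that means using NuGet's exact-version range syntax rather than a bare version (a bare version such as "1.2.4" is treated as a minimum, not a pin). A minimal sketch, with illustrative package names:

    {
      "dependencies": {
        "MyCompany.PackageX": "[1.2.4]",
        "Newtonsoft.Json": "[9.0.1]"
      }
    }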
Downsides
This doesn't work if you build a library for other people to use, since you pin them to a certain version of your dependencies.
Dependencies of dependencies still get resolved automatically, so if you don't specify them yourself, you can't guarantee their versions on dotnet restore.
Conclusion
If you want to avoid 'works on my machine' quotes and the hell of constantly manually updating to newer versions: check in the project.lock.json.
And also build a CI/release-build check that tests whether this file has changed compared to git, before you release (if your software is very critical)!
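A sketch of such a gate (assumes git; the restore step regenerates the lock file, and git diff --exit-code fails the build if it no longer matches what was committed):

    dotnet restore
    git diff --exit-code -- project.lock.json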
If this is not a problem, and automatically updating (to a potentially broken package) is also not a big problem, you might not want to commit your project.lock.json.
No - it is just a lock file. Really, you should never check a lock file into source control (except when the tool that generated it expects it to be committed; in that case, remove it from your ignore file!).

Can't create new projects in VS2013 -- most references are missing

About a week ago I noticed strange behavior with my install of Visual Studio 2013 Pro. Creating new projects always results in missing references to EntityFramework and most of the Microsoft.* components. I had reinstalled .NET 4.5 in repair mode around that time but can't recall if this problem happened before or after that install.
As it stands, I can no longer create a functioning project. I have an existing project I'm working on that will compile and run without issue, but creating any new projects (which I need for spike solutions etc) is no longer possible until this is fixed.
Screenshots follow. These are all from creating a new MVC project with all defaults accepted.
References list showing missing references
Error list upon building
Reference paths are empty (this was mentioned in another answer that did not directly address my specific question, so I'm including it)
Regedit showing .NET versions installed
Even though I have "repaired" .NET 4.5 it appears from regedit that I only have up to .NET 4 installed? Am I reading that correctly?
Also, due to network restrictions I cannot download packages from NuGet automatically - I have to download them manually from a laptop off-network and then sneakernet them over to install. The network physically blocks all connections to NuGet, GitHub, etc.
If allowing VS to connect to Nuget is the only viable option then I have considered installing VS on the laptop, creating the project there and installing all necessary dependencies, and then moving the project folder over to the restricted computer and continuing from there. But I don't know if that is a solution to this problem or not.
Any advice appreciated, thanks.
The network blocks all connections to NuGet, GitHub, etc.
It's almost like they don't want you to be productive.
Anyway, the project templates (which you seem to be talking about) reference specific NuGet packages. By default, packages are stored relative to your solution.
Place a nuget.config at your disk's root (or anywhere above your projects directory - if you keep them organized like C:\Dev\Visual Studio\Projects, then each of those subfolders will be fine) and point in that file to a shared package directory on your development machine. There you can dump all the packages you require.
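A minimal sketch of such a file (the path is illustrative; repositoryPath controls where packages.config-based projects look for packages):

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <config>
        <add key="repositoryPath" value="C:\Dev\Packages" />
      </config>
    </configuration>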

STS: Losing references in Java Build Path

I am using SpringSource Tool Suite 2.8.1 to implement Spring applications.
I frequently get build errors because references are lost for no apparent reason. Under right-click project in Package Explorer -> Properties -> Java Build Path -> Order and Export, I find that projects are sometimes deselected. And often packages are gone under right-click project in Package Explorer -> Properties -> Deployment Assembly.
Having to reset these settings frequently is frustrating. Is there some way I can work around these problems?
I have tried to update STS to the latest version, but the upgrade process fails with incomprehensible error messages. I want to avoid a clean install because setting up the environment again would probably be a nightmare.
Now that I know this is a Maven project and you are adding references yourself, this makes sense to me. STS 2.8.x was the last STS to ship with the legacy m2e (Maven plugin for Eclipse). It did not recognize build-path entries added manually (it likes to have complete control over the classpath). So what is likely happening is that you add these classpath entries, and then an update-project operation gets kicked off automatically, which has the effect of removing all of your extra classpath entries.
You are best off doing the following:
Upgrading STS
Or just upgrading your m2e component (you will have to first uninstall the old m2e, but this should be taken care of automatically from the discovery update page).
Or, just accept the fact that you can't manually change your classpath with the legacy m2e.
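One related note: if the JARs you keep re-adding are available as Maven artifacts, declaring them in the pom.xml is the m2e-friendly route, since pom-declared dependencies survive project updates. A sketch with hypothetical coordinates:

    <dependency>
      <groupId>com.example</groupId>
      <artifactId>some-library</artifactId>
      <version>1.0.0</version>
    </dependency>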

Automatic BizTalk Versioning in My Build Process

In all of my other .NET apps, my build process (a mixture of NAnt and custom tasks) automatically updates the [AssemblyVersion] attribute in AssemblyInfo.cs with the current build number before the call to msbuild, stamping the build number into the version number.
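For context, the attribute being stamped looks like this in AssemblyInfo.cs (the build process rewrites the version string, e.g. replacing the last part with the build number):

    using System.Reflection;

    // stamped by the build before msbuild runs
    [assembly: AssemblyVersion("1.0.0.0")]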
I'm now working on my first BizTalk project and I'd like to do the same thing with the version numbers of the BizTalk assemblies, but I've run into trouble!
First of all, the assembly version numbers are stored in the .btproj files, so I did some googling and found www.codeplex.com/biztalk, which looked like the answer to my problem - but there is a deeper problem!
I have a project for my schemas and another for my pipelines; the pipelines project references my schemas project, as I have flat-file dis/assemblers. The problem comes when I update the version numbers, as updating them even from within Visual Studio does not update the pipeline components' references to the schemas.
So if I update all the version numbers manually in the VS IDE from 1.0.0.0 to 1.1.0.0, the build fails, as the pipeline components' flat-file dis/assemblers still reference the old 1.0.0.0 version of the schemas! They don't automatically update!
Is this really a manual process of updating the version numbers of the BizTalk projects in the property pages, then building the projects and manually updating the references to them in the properties of all the pipeline components that reference them?
This means that I can't have my build process control the build number part of my version numbers!
Or is there a better method of managing the version numbers of the BizTalk assemblies?
I'm sorry to disappoint you, but I've been down the exact same road and had to give up. I guess it could be possible to achieve, but it would require a lot of changes to both the binding files and other XML files (as you mentioned - and even more if you have published services, etc.).
Maybe it would be possible to wrap all these necessary changes in a build step (an MSBuild step, or similar in other build frameworks) - that would be useful!
Developer- :)
We had a similar problem, and we ended up developing a small utility that changes the version number in all the projects, i.e. *.csproj (AssemblyInfo.cs) and *.btproj, accordingly. Apart from this, it opens and modifies the *.btp files with the new version of the schemas. In a nutshell, all you have to do is configure this utility in your VS.NET Tools menu and execute it.
I guess it's not very difficult to develop such a utility in any .NET language.
Caveat: Do not forget to save the files after updates with the same encoding as they were originally.
Cheers!
Gutted - I thought that might be the case. Maybe BizTalk 2009 projects will play more nicely with updating references when changing version numbers.
I started to go through and automate it manually, and when I realised what needed to be done, I took a biiig step back when I realised just how many places I'd have to modify to get it working. Thank god for Undo Checkout.
I do have a standard C# class library included in my project (various helper functions), whose version number I am able to update during my build process, so I'm basically using that one assembly to version the whole application. If anyone wants to know which version is in any environment, check the version number of that one assembly.
Not ideal, but it's working.
We've done this successfully on our project - I'll see if I can get the developer of the tool to post details...
This problem arises when you perform an integration build against the latest versions of your dependent components as file references (a.k.a. the schemas here).
Keep in mind that upgrading the assembly version must always be performed manually; that way, you are always in charge of changes to assembly versions.
A possible solution to the build-break issue is to file-reference a specific version of a dependent component's build rather than the latest version, and to use a subst drive and a copy script to get the latest component builds.
For example:
SchemaA, assembly version 1.0.0.0
PipelineA (with the pipeline component XMLValidator, for example), assembly version 1.0.0.0
PipelineA has a file reference to a subst drive (say the R drive, which maps to a workspace D:\MyComponents) and to version 1.0.0.0 of SchemaA, as follows:
R:\SchemaA\1.0.0.0\SchemaA.dll
The copy script copies the build output of SchemaA locally to your R drive.
When SchemaA updates to version 1.1.0.0, you don't have any issues, because you still use version 1.0.0.0 - and YOU have the choice of when to use the 1.1.0.0 version of your schema. When you want to upgrade, you alter your copy script and replace the file reference with R:\SchemaA\1.1.0.0\SchemaA.dll.
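A minimal sketch of the mapping and copy step (all paths are illustrative):

    rem map the component workspace to a stable drive letter
    subst R: D:\MyComponents
    rem copy-script step: publish a specific SchemaA build into its versioned folder
    xcopy /Y D:\Builds\SchemaA\bin\Release\SchemaA.dll R:\SchemaA\1.0.0.0\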
