I'm new to GitVersion and don't fully understand how it works.
GitVersion for develop branch shows 4.2.0.
Develop is merged to master.
GitVersion for master shows 4.1.2.
What I need is 4.1.3 on develop, i.e. decrement 4.2.0 -> 4.1.3.
I'm using GitVersion on Azure Pipelines.
The default configuration for the develop branch is increment: Minor. If you add a GitVersion.yml file to your repository (and I think you need to tell AzDO about its location; a peculiarity of the AzDO task, not inherent to GitVersion) and configure develop to increment: Patch, I think you'll get the behaviour you want. Example:
branches:
  develop:
    increment: Patch
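If you also need to point the Azure Pipelines task at the file, the GitTools tasks accept a config file path; a rough sketch (input names can differ between task versions, so treat this as an assumption rather than a verified snippet):

- task: gitversion/setup@0
  inputs:
    versionSpec: '5.x'
- task: gitversion/execute@0
  inputs:
    useConfigFile: true
    configFilePath: 'GitVersion.yml'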
I have solution with following projects:
Api
Application
Infrastructure
Tests
Api is a web application (the entry point) and has ProjectReferences to the libraries Application and Infrastructure.
Tests is an xUnit test project and has ProjectReferences to Api / Application / Infrastructure.
I want consistent package versions both when publishing the main (Api) project and when running tests.
I added the following properties to Api.csproj:
<RestorePackagesWithLockFile>true</RestorePackagesWithLockFile>
<RestoreLockedMode Condition="'$(CI)' == 'true'">true</RestoreLockedMode>
This generated Api/packages.lock.json, and it seems that this file also tracks the versions of dependencies of the referenced projects.
Here is how I publish the application (Api):
RUN dotnet restore ./Api/Api.csproj
RUN dotnet publish Api -c Release -o out --runtime alpine-x64 --self-contained true /p:PublishTrimmed=true
So if the CI=true environment variable is set, the commands above should either restore packages exactly according to packages.lock.json or fail.
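For what it's worth, locked-mode restore can also be requested explicitly on the command line instead of relying on the CI variable; a small sketch using the same project path as above:

# Fails the restore if packages.lock.json is missing or out of date
dotnet restore ./Api/Api.csproj --locked-mode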
However, before publishing Api I run the tests like this:
dotnet test ./Tests/Tests.csproj
My question is: how do I ensure that exactly the same package versions are used during testing as are recorded in Api/packages.lock.json? If I add <RestorePackagesWithLockFile>true</RestorePackagesWithLockFile> to the Tests project, it will get its own separate Tests/packages.lock.json file, which may not match Api/packages.lock.json, right? On the other hand, when the Tests project references the Api project, then from what I understand Api/packages.lock.json is ignored (when running the Tests project)?
Is it possible to have one packages.lock.json for the whole solution (the same for all projects in the solution)?
I feel a bit bad for making this an answer and possibly getting rep votes, when mu88 beat me by 12 hours in the comments to the question, but Central Package Management is the answer. There's also a blog post about it.
Currently, neither Visual Studio nor dotnet add package supports installing or upgrading packages under Central Package Management, so you will need to hand-edit all the XML (csproj, props) files. But support should be coming in VS 2022 17.4 and .NET SDK 6.0.400.
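As a rough sketch of what Central Package Management looks like (the package name and version below are placeholders, not taken from the question): all versions move into a single Directory.Packages.props at the repository root, and every project references packages without a Version attribute, so Api and Tests necessarily resolve the same versions.

<!-- Directory.Packages.props at the repository root -->
<Project>
  <PropertyGroup>
    <ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
  </PropertyGroup>
  <ItemGroup>
    <PackageVersion Include="Newtonsoft.Json" Version="13.0.1" />
  </ItemGroup>
</Project>

<!-- In any csproj of the solution: no Version attribute on PackageReference -->
<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" />
</ItemGroup>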
For the past 3 weeks, the dotnet test task in our pipeline has not been terminating at all, and we are forced to cancel the pipeline.
We have observed the following log message, which we had not seen before these 3 weeks (no changes have been made in these 3 weeks at all, either to our code or to Windows on our on-premises agent, SQL Server, or any other service on the agent):
The STDIO streams did not close within 10 seconds of the exit event from process 'C:\AzAgent_work_tool\dotnet\dotnet.exe'. This may indicate a child process inherited the STDIO streams and has not yet exited.
##[error]Error: The process 'C:\AzAgent_work_tool\dotnet\dotnet.exe' failed with exit code 1
All the tests in the task pass though.
Publishing test results to test run '1033120'.
TestResults To Publish 233, Test run id:1033120
Test results publishing 233, remaining: 0. Test run id: 1033120
Published Test Run :
The test results are published.
We have observed this in the logs:
##[warning].NET 5 has some compatibility issues with older Nuget versions(<=5.7), so if you are using an older Nuget version(and not dotnet cli) to restore, then the dotnet cli commands (e.g. dotnet build) which rely on such restored packages might fail. To mitigate such error, you can either: (1) - Use dotnet cli to restore, (2) - Use Nuget version 5.8 to restore, (3) - Use global.json using an older sdk version(<=3) to build
Info: Azure Pipelines hosted agents have been updated and now contain .Net 5.x SDK/Runtime along with the older .Net Core version which are currently lts. Unless you have locked down a SDK version for your project(s), 5.x SDK might be picked up which might have breaking behavior as compared to previous versions. You can learn more about the breaking changes here: https://learn.microsoft.com/en-us/dotnet/core/tools/ and https://learn.microsoft.com/en-us/dotnet/core/compatibility/ . To learn about more such changes and troubleshoot, refer here: https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/build/dotnet-core-cli?view=azure-devops#troubleshooting
Hence we added this in our YAML before the test task, as our app is built with .NET Core 3.1:
- task: UseDotNet@2
  inputs:
    packageType: sdk
    version: 3.x
    installationPath: $(Agent.ToolsDirectory)/dotnet
  displayName: 'Installing .net core sdk 3.x'
  condition: succeeded()
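For reference, option (3) from the warning above corresponds to pinning the SDK with a global.json at the repository root instead of (or in addition to) the task; a sketch, where the exact 3.1 patch version is an assumption:

{
  "sdk": {
    "version": "3.1.415",
    "rollForward": "latestFeature"
  }
}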
I did a little investigation and found this link on the MS github page: https://github.com/microsoft/azure-pipelines-tasks/issues/13033
We have set these variables:
- name: TASKLIB_TEST_TOOLRUNNER_EXITDELAY
  value: 60000
- name: 'NUGET_PLUGIN_REQUEST_TIMEOUT_IN_SECONDS'
  value: 30
- name: 'NUGET_PLUGIN_HANDSHAKE_TIMEOUT_IN_SECONDS'
  value: 30
No effect yet. How can we solve this issue? It is not affecting our release, but we would like to have a green pipeline. Tips on this would be greatly appreciated.
Thanks to Ashwin for the answer.
There was an issue with the chromedriver executable, which had not been updated for Chrome 89, so one of the tests was hanging. After updating chromedriver and removing the filter, the tests ran fine and the stage terminated. No changes to the pipeline YAML were required.
Posting here so others who have the same issue can find this answer quickly.
I am configuring semantic versioning with GitLab for my dotnet core apps and netstandard 2.0 packages.
After reading quite a few opinions, some of them contradictory, this is what is clear to me.
A semantic version should be something like
M.m.P.B-abc123 where
M is major version
m is minor version
P is patch version
B is build version (optional)
-abc123 is a suffix (optional) in case I use pre-releases; it must start with a letter
So the following package versions would be valid:
1.0.0
1.0.1.20190301123
1.0.1.20190301123-beta
1.0.1-rc1
I have the following GitLab CI script for my versioning:
#Stages
stages:
  - ci
  - pack
#Global variables
variables:
  GITLAB_RUNNER_DOTNET_CORE: mcr.microsoft.com/dotnet/core/sdk:2.2
  NUGET_REPOSITORY: $NEXUS_NUGET_REPOSITORY
  NUGET_API_KEY: $NEXUS_API_KEY
  NUGET_FOLDER_NAME: nupkgs
#Docker image
image: $GITLAB_RUNNER_DOTNET_CORE
#Jobs
ci:
  stage: ci
  script:
    - dotnet restore --no-cache --force
    - dotnet build --configuration Release
    - dotnet vstest *Tests/bin/Release/**/*Tests.dll
pack-beta-nuget:
  stage: pack
  script:
    - export VERSION_SUFFIX=beta$CI_PIPELINE_ID
    - dotnet pack *.sln --configuration Release --output $NUGET_FOLDER_NAME --version-suffix $VERSION_SUFFIX --include-symbols
    - dotnet nuget push **/*.nupkg --api-key $NUGET_API_KEY --source $NUGET_REPOSITORY
  except:
    - master
pack-nuget:
  stage: pack
  script:
    - dotnet restore
    - dotnet pack *.sln --configuration Release --output $NUGET_FOLDER_NAME
    - dotnet nuget push **/*.nupkg --api-key $NUGET_API_KEY --source $NUGET_REPOSITORY
  only:
    - master
This generates packages such as:
1.0.0 for the master branch (stable or production ready) and 1.0.0-beta1234567 for any other branch.
The problem with my approach is that I have VS solutions with multiple projects; each project becomes a NuGet package and each one has its own version. Sometimes I modify one project but not the others, so in theory I shouldn't need to produce a new artifact, or a new version, of a project that I didn't touch.
Right now my NuGet repository prevents overwriting packages, so if there is an XXX.YYY 1.0.0 and I generate another XXX.YYY 1.0.0 and push it to the repository, it will throw an error and the pipeline will fail.
I have thought that maybe it's not such a bad idea to generate a new package each time I run the CI/CD pipeline, so I considered introducing the build number and having something like XXX.YYY 1.0.0.12345; then, even if I don't touch anything, the next run would produce a new package XXX.YYY 1.0.0.123499.
Is this a correct approach in a continuous deployment scenario, or should I look for a way to make my script smarter and not produce a new artifact if one with the same version already exists in my NuGet repository?
Assuming it's OK to always use build numbers, how do I make sure that only the build number is taken from the pipeline while the M.m.P version numbers stay in my csproj, as in the following?
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <Description>Whatever</Description>
    <VersionPrefix>1.0.1</VersionPrefix>
  </PropertyGroup>
</Project>
I would need something like:
dotnet pack *.sln --configuration Release -p:PackageVersion=$FIXED_VERSION.$CI_PIPELINE_ID --output nupkg
but I don't know how to retrieve the <VersionPrefix> content from the csproj through the CLI.
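One possible way to read the prefix in the CI script, assuming a single <VersionPrefix> element per csproj (the project path below is just a placeholder):

# Extract e.g. 1.0.1 from the csproj, then append the pipeline ID
FIXED_VERSION=$(sed -n 's:.*<VersionPrefix>\(.*\)</VersionPrefix>.*:\1:p' MyProject/MyProject.csproj)
dotnet pack *.sln --configuration Release -p:PackageVersion=$FIXED_VERSION.$CI_PIPELINE_ID --output nupkg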
Any advice, good reads or solutions for my approach, assuming it's valid?
Thanks
The problem with my approach is that I have VS solutions with multiple projects; each project becomes a NuGet package and each one has its own version. Sometimes I modify one project but not the others, so in theory I shouldn't need to produce a new artifact, or a new version, of a project that I didn't touch.
Since a continuous integration pipeline is unable to determine whether your code should be a new minor or major version, you will always have to decide yourself which semantic version your package should get. This makes the whole process a lot easier.
The Visual Studio Team Services folks have this to say about it:
Immutability and unique version numbers
In NuGet, a particular package is identified by its name and version number. Once you publish a package at a particular version, you can never change its contents. But when you’re producing a CI package, you can’t know whether it will be version 1.2.3 or just a step along the way towards 1.2.3. You don’t want to burn the 1.2.3 version number on a package that still needs a few bug fixes.
SemVer to the rescue! In addition to Major.Minor.Patch, SemVer provides for a prerelease label. Prerelease labels are a “-” followed by whatever letters and numbers you want. Version 1.0.0-alpha, 1.0.0-beta, and 1.0.0-foo12345 are all prerelease versions of 1.0.0. Even better, SemVer specifies that when you sort by version number, those prerelease versions fit exactly where you’d expect: 0.99.999 < 1.0.0-alpha < 1.0.0 < 1.0.1-beta.
Xavier’s blog post describes “hijacking” the prerelease tag to use for CI. We don’t think it’s hijacking, though. This is exactly what we do on Visual Studio Team Services to create our CI packages. We’ll overcome the paradox by picking a version number, then producing prereleases of that version. Does it matter that we leave a prerelease tag in the version number? For most use cases, not really. If you’re pinning to a specific version number of a dependent package, there will be no impact, and if you use floating version ranges with project.json, it will mean a small change to the version range you specify.
Source: https://devblogs.microsoft.com/devops/versioning-nuget-packages-cd-1/
As stated above, you still need to pick a version yourself for your new package. For the process before the actual publishing of the final package (the master branch in your case) you can use the pre-release label to include the build number.
Without doing anything smart, this will publish a new package for every pipeline run. The people at Visual Studio Team Services and I do not think that this is a bad thing, but it is up to everyone's personal taste.
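As a concrete sketch of that advice, reusing the VersionPrefix from the question's csproj (so the resulting numbers are only illustrative):

# On any non-master branch: a pre-release package, e.g. 1.0.1-beta12345
dotnet pack *.sln --configuration Release --version-suffix beta$CI_PIPELINE_ID
# On master: the final package, e.g. 1.0.1
dotnet pack *.sln --configuration Release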
I know it's a little bit late, but maybe others will also ask this question in the future.
For background: I work in a small company and we started out in the Node.js world with NPM packages. After a while we also started building with .NET Core and adapted some things from the JavaScript world. A tool we use heavily is Semantic Release. It automates versioning by parsing commit messages: you (and everyone developing on the project) use a special commit format with a prefix type like fix, feat, chore and so on, and Semantic Release parses every commit message since the last release and determines the next version number.
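For illustration, these are the kinds of commit messages it parses and the bumps they trigger under the default rules (the messages themselves are made up):

fix: handle empty connection string          -> patch release (1.0.0 -> 1.0.1)
feat: add retry policy to the HTTP client    -> minor release (1.0.1 -> 1.1.0)
feat: rework the public client API
(body contains a "BREAKING CHANGE:" footer)  -> major release (1.1.0 -> 2.0.0)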
Since in the JS world you only have to update the package.json file it kind of works out of the box, but for .NET there is a little bit more work to do.
We use the Directory.Build.props file to set the version, like you would with package.json (but it would work in a csproj file too).
Now, Semantic Release works with plugins, and one plugin we wrote can update files with the version number, so we use it to update the version number in Directory.Build.props. Instead of keeping the version in the file you could also pass the version number directly to the dotnet CLI, but we found it more convenient to have it in the file.
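A minimal sketch of such a Directory.Build.props; the concrete number is just a placeholder that the release tooling overwrites:

<!-- Directory.Build.props at the repository root; applies to every project underneath it -->
<Project>
  <PropertyGroup>
    <Version>1.2.3</Version>
  </PropertyGroup>
</Project>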
So here's the flow we use for our (internal) NuGet packages:
configure Semantic Release to update the version number in the Directory.Build.props file (before creating the package)
create the NuGet package (we have another plugin for that, but you could just use the exec plugin during the prepare stage of SR)
publish the NuGet package during the publish stage of SR
The CI script is then just a call to npx semantic-release in an environment that has NPM as well as the .NET SDK available (we use container images for that).
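A rough sketch of what the configuration and the pipeline job could look like; it uses the public @semantic-release/exec plugin rather than our internal ones, so the plugin list and commands are assumptions about a typical setup, not our exact files:

# .releaserc.yml
branches:
  - master
plugins:
  - "@semantic-release/commit-analyzer"
  - "@semantic-release/release-notes-generator"
  - "@semantic-release/changelog"
  - - "@semantic-release/exec"
    - prepareCmd: dotnet pack -c Release -p:PackageVersion=${nextRelease.version} -o out
      publishCmd: dotnet nuget push out/*.nupkg --api-key $NUGET_API_KEY --source $NUGET_REPOSITORY
  - - "@semantic-release/git"
    - assets: ["CHANGELOG.md", "Directory.Build.props"]

# .gitlab-ci.yml job that runs it (the image is a placeholder for anything with Node and the .NET SDK)
release:
  stage: release
  image: my-registry/node-and-dotnet-sdk:latest
  script:
    - npx semantic-release
  only:
    - master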
Especially in the OP's situation it would be a little more complicated, since multiple projects live in one repository and SR is made to work per repository (it uses git tags to determine the last version). So I would advise splitting it into multiple repositories.
The benefits are:
automation of releases whenever there is a new feature/bugfix
automation of version numbers that adhere to Semantic Versioning (and is unopinionated)
a changelog is automatically generated along with every release
depending on the Git platform you can also create releases (there are plugins for GitHub, GitLab and others)
you can configure SR to create pre-releases on certain branches, for example an alpha, beta or develop branch
you could also add the commit hash to the NuGet package so everyone using it knows exactly what is bundled in it
Semantic Release creates a commit with the updated package.json (you can configure it to also include the updated Directory.Build.props, and I would advise doing so)
And all you have to do is configure it once and enforce a special commit message style (which one is not so important; you can configure SR accordingly).
I am trying to run a simple gRPC client-server sample on a Raspberry Pi running Raspbian OS.
Language I am using: C#, .NET Core 2.1.
I downloaded a sample project from here.
This is a .NET Core project. I am able to run it in a Windows environment, and I am also able to modify the .proto file in this code and run it successfully.
I published the solution as it is with the command
dotnet publish -r linux-arm
When I tried running the same on the RPi, I get an exception. The attached screenshot has the details.
Any help getting through this would be of great use.
tl;dr The problem is the libgrpc_csharp_ext native library which currently does not get compiled and built for the arm7 processor. I've compiled it (on a pi) for arm7 and released a nuget package to bridge the gap until they support it all the way: https://www.nuget.org/packages/libgrpc_csharp_ext.arm7/
I'll update with a link to a blog post when I finish the rest of the tooling and template I'm working on.
fuller explanation: the Grpc.Core nuget package contains the native libgrpc_csharp_ext library that the dotnet implementation of grpc loads in NativeExtensions.cs then maps with PInvoke in NativeMethods.Generated.cs. Inspecting that package, you'll see a version of that library in each /runtimes/[win, osx, linux]/native folder. Unfortunately, no linux-arm version of the library is included. However, in the code, if the platform is linux, it will try to load the static library using the name as formatted here. Dissect that a little and you'll see that as of right now, any 'linux' platform that isn't '64bit' (which despite the proc on the pi being 64 bit, the distro of linux you're using on there, including raspbian, likely isn't) will look for libgrpc_csharp_ext.x86.so. When you dotnet publish -r linux-arm, you'll see that library there in the build output, but unfortunately, it's the wrong one (I think publish just grabs 'the closest one' when it can't find a specific library in the runtimes folder).
The nuget package I created above is compiled for arm7 - I actually cloned the grpc repo onto a pi and peeled away enough of the /csharp build to just cmake the libgrpc_csharp_ext. The 'trick' the package uses is to put the library in runtimes/linux-arm/native folder within the package, which dotnet core recognizes when publishing and pulls into the build output - but the library is still named libgrpc_csharp_ext.x86.so because of the way NativeMethods.cs formats the library name.
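If it helps, consuming that package should just be a matter of referencing it next to Grpc.Core so its runtimes/linux-arm/native asset flows into the publish output; roughly (a sketch, with whatever package versions are current):

dotnet add package Grpc.Core
dotnet add package libgrpc_csharp_ext.arm7
dotnet publish -r linux-arm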
When I select and run a test, the build fails with the message:
"Kotlin: Usage of '#JvmDefault' is only allowed if the flag -Xenable-jvm-default is enabled" for the following files.
corda/serialization/src/main/kotlin/net/corda/serialization/internal/OrdinalIO.kt
corda/serialization/src/main/kotlin/net/corda/serialization/internal/SerializationFormat.kt
corda/serialization/src/main/kotlin/net/corda/serialization/internal/amqp/AMQPSerializer.kt
I have cloned Corda from my fork of corda/corda on GitHub, and am on branch master, opened in IntelliJ as per the instructions on the docsite. The JDK version is 1.8.0_152 and the Kotlin plugin is version 1.2.41. I see that -Xenable-jvm-default is enabled in the corda/build.gradle file. There are no local changes. Could you please advise on what I missed or what I need to do to fix this?
This can be fixed by invalidating IntelliJ's caches and restarting IntelliJ. See jetbrains.com/help/rider/Cleaning_System_Cache.html.
Make sure you are using the Gradle runner to execute the tests in IntelliJ.
Navigate to Build, Execution, Deployment -> Build Tools -> Gradle -> Runner (or search for "runner")
Windows: this is in “Settings”
MacOS: this is in “Preferences”
Set “Delegate IDE build/run actions to gradle” to true
Set “Run tests using:” to “Gradle Test Runner”