With the release of .NET Standard 2.0, it's advised to target .NET Standard 2.0 even if you already target 1.x.
From https://learn.microsoft.com/en-us/dotnet/standard/net-standard:
However, targeting lower .NET Standard versions introduces a number of support dependencies. If your project targets .NET Standard 1.x, we recommend that you also target .NET Standard 2.0. This simplifies the dependency graph for users of your library that run on .NET Standard 2.0 compatible frameworks, and it reduces the number of packages they need to download.
Now another big change is near: .NET Core 3, and I see that Microsoft is also targeting .NET Core 3 in its own packages.
For example, Microsoft.Extensions.Logging targets .NET Standard 2.0 and also .NET Core 3 (.NETCoreApp 3.0).
I compared the XML files and both APIs look the same (maybe not the best way to compare them).
Now the question ;)
As a library maintainer whose library depends on Microsoft.Extensions.Logging and who is trying to support .NET Core 3:
Should I also target .NET Core 3 - or is just .NET Standard 2.0 good enough if I don't need specific stuff of .NET Core 3?
Short answer
You don't have to target .NET Core 3 if you don't need anything from it and don't want to offer any .NET Core 3-specific optimizations. On the other hand, double targeting doesn't cost you anything, and at the very least it may let you drop package references to libraries that now ship as part of the .NET Core 3 runtime.
Long answer
It depends entirely on what you want to do. A library doesn't have to target .NET Core 3.0 just because its dependencies include it in their targets.
For example, the source code shows that Microsoft.Extensions.Logging doesn't seem to have any C# 8/.NET Core 3.0 specific code. It targets 3.0 because it's part of that wave of extensions, so double-targeting doesn't require any modifications.
On the other hand, when it targets .NET Core 3.0, Config.Json doesn't have to reference System.Text.Json and System.Threading.Tasks.Extensions, because they are part of the runtime; those references are only added for the netstandard2.0 target:
<ItemGroup Condition="'$(TargetFramework)' == 'netstandard2.0'">
  <Reference Include="System.Text.Json" />
  <Reference Include="System.Threading.Tasks.Extensions" />
</ItemGroup>
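For a library author, the double targeting itself is a one-line change in an SDK-style project. A minimal sketch (the project layout and the package version are illustrative, not taken from the actual Config.Json project):

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- Build one assembly per target; NuGet picks the best match per consumer. -->
    <TargetFrameworks>netstandard2.0;netcoreapp3.0</TargetFrameworks>
  </PropertyGroup>

  <ItemGroup Condition="'$(TargetFramework)' == 'netstandard2.0'">
    <!-- Only the netstandard2.0 build needs this package; on .NET Core 3.0
         the API ships with the runtime. Version is illustrative. -->
    <PackageReference Include="System.Text.Json" Version="4.6.0" />
  </ItemGroup>
</Project>

NuGet then packs one set of assemblies per TFM, and consumers get the best-matching one automatically.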
Other benefits
For maintainers, .NET Core 3.0/.NET Standard 2.1 offer a lot of sanity preserving features like:
Nullable Reference Types. You'll avoid a lot of NREs in your own code. You'll probably catch a lot of hidden bugs too.
Default interface members. You won't have to worry about breaking users' code when you add a new member to a public interface.
IAsyncEnumerable<T>. No more waiting for all results from a bunch of asynchronous operations before you can start processing them.
The switch expression and far more powerful pattern matching and deconstruction syntax (see the quick sketch below).
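As a quick, hypothetical illustration of that last point, written against C# 8:

// A made-up example of the switch expression with pattern matching.
public static class Describer
{
    public static string Describe(object value) => value switch
    {
        null             => "nothing",
        int n when n < 0 => "a negative number",
        int n            => $"the number {n}",
        string s         => $"a string of length {s.Length}",
        _                => $"a {value.GetType().Name}"
    };
}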
For some of those features you only need to add a few methods that are available only on .NET Core. For example, the ChannelReader<T> class adds a single ReadAllAsync() method in a partial file; it reads items from a channel and returns an IAsyncEnumerable<T>, e.g.:
public virtual async IAsyncEnumerable<T> ReadAllAsync([EnumeratorCancellation] CancellationToken cancellationToken = default)
{
    while (await WaitToReadAsync(cancellationToken).ConfigureAwait(false))
    {
        while (TryRead(out T item))
        {
            yield return item;
        }
    }
}
This is a small but very convenient addition. It allows you to receive messages with:
await foreach (var msg in reader.ReadAllAsync())
{
    // ...
}
NRTs, on the other hand, help even the .NET Standard 2.0 target, because the nullability bugs the compiler catches while building for .NET Core 3.0 live in source code that is shared by both targets.
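A minimal sketch of opting in (assuming an SDK-style project; the properties apply to every target, and the warnings you fix benefit the netstandard2.0 build too):

<PropertyGroup>
  <!-- Enable C# 8 and nullable reference type analysis for all targets. -->
  <LangVersion>8.0</LangVersion>
  <Nullable>enable</Nullable>
</PropertyGroup>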
As a library maintainer whose library depends on Microsoft.Extensions.Logging and who is trying to support .NET Core 3: Should I also target .NET Core 3 - or is just .NET Standard 2.0 good enough if I don't need specific stuff of .NET Core 3?
Targeting .NET Standard 2.0 in your library is good enough as long as all of your dependencies target .NET Standard 2.0 as well, including Microsoft.Extensions.Logging.
As Panagiotis Kanavos said, there can be benefits of targeting .NET Core 3.0 for the consumers of your library, so if that's the case and it doesn't cost you too much, then by all means target .NET Core 3.0 in addition to .NET Standard 2.0.
As karann said, NuGet will always select the best-matching assets for every package in the graph, i.e.:
App A uses your library and targets .NET Core 3.0. Your library targets .NET Standard 2.0 only. NuGet will use that, and it's fine.
App A uses your library and targets .NET Core 3.0. Your library targets both .NET Standard 2.0 and .NET Core 3.0. NuGet will choose .NET Core 3.0.
When someone installs a package, NuGet uses the assets from the TFM that best matches the TFM of the project. It does this for the transitive dependencies too.
For example, if the project targets netcore30 and package A has assets under lib/netcore30 and lib/netstandard20, NuGet will select lib/netcore30. Say package A depends on package B, and package B has assets for netstandard20 and net472; NuGet will select netstandard20.
The bottom line is that NuGet will select the best-matching assets for every package in the graph. So, as a library maintainer, you don't need to add two TFMs to support netcore30. You can target netstandard21, which implies support for netcore30, according to this doc: https://learn.microsoft.com/en-us/dotnet/standard/net-standard
Related
I have a .NET Core 2.2 project, created using the Web Application (Model-View-Controller) template. I can reference my .NET Framework 4.7.1 projects from this Core project; it compiles, runs, and is deployed on my test servers.
1) Then I read about 2.2 reaching end of life, so I tried to migrate to 3.1, but I cannot reference .NET Framework 4.7.1 projects from a 3.1 project. I don't know what my next step is here.
2) I read that I can convert my DLLs to .NET Standard and reference them - but how can I do this?
3) These 4.7.1 DLLs are shared by .NET Framework projects and Core projects, so if I change them to .NET Standard, will my .NET Framework applications still work?
4) Also - should I migrate my 2.2 Core projects to 3.0 because of the EOL? Is that mandatory? How will EOL affect audits if I don't migrate?
First, 2.2 is EOL, because 2.1 is the LTS release. You can downgrade to 2.1 if you don't want to jump to 3.x yet, and you'll still have a year or two of support there, I think.
However, 3.x takes the first step towards the new vision of one .NET (.NET 5 for all workflows), so the sooner you can get there, the better. 3.1, specifically, is the LTS release for 3.x, so stick there if you don't want to be forced to upgrade again for a while.
.NET Core 3.x implements .NET Standard 2.1, which is why you can no longer target .NET Framework with that (no version of .NET Framework implements .NET Standard 2.1 and never will). However, .NET Standard 2.0 is supported by both .NET Core (2.x and 3.x) and .NET Framework 4.6.1+. As a result, if you need to share a library between all these targets, you should target .NET Standard 2.0.
As far as converting your existing libraries goes, you simply change the target framework to .NET Standard 2.0. That's literally it. Once you do that, some functionality in the library may fail to build (anything that requires .NET Framework, i.e. Windows-specific APIs). At that point, you either need to rewrite those parts of the library to use .NET Standard-compatible APIs, or use compiler directives to substitute alternate implementations for .NET Standard 2.0/.NET Core, at which point you'd have to multi-target the library (i.e. .NET Framework and .NET Standard 2.0, or even specifically .NET Core). When compiling, DLLs will be generated for each target, allowing you to seamlessly reference the same library from projects targeting any of the library's targets.
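A hypothetical sketch of that compiler-directive approach (the registry key, environment variable, and helper class are all made up; the point is how the directives gate the Windows-specific API in a library multi-targeting .NET Framework and .NET Standard 2.0):

using System;

public static class InstallLocator
{
    public static string GetInstallPath()
    {
#if NETFRAMEWORK
        // Windows-only API: compiled only into the .NET Framework build.
        return (string)Microsoft.Win32.Registry.GetValue(
            @"HKEY_LOCAL_MACHINE\SOFTWARE\MyApp", "InstallPath", null);
#else
        // .NET Standard build: fall back to a cross-platform source.
        return Environment.GetEnvironmentVariable("MYAPP_INSTALL_PATH");
#endif
    }
}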
If you're doing anything with ASP.NET Core components in your libraries, you should factor that code out into separate libraries and target .NET Core 3.1 directly there. There's no point in targeting .NET Standard 2.1, as that code will only ever be applicable to .NET Core, anyways. You should also work in the opposite direction. In other words, if there's anything that's only applicable to .NET Framework projects (Web Forms, etc.), then factor that out into separate libraries that will only target .NET Framework. That will allow you to migrate the remaining parts of the library more easily to .NET Standard 2.0.
We have a commercial library that I am working to port to .NET Core. There are a couple of calls in it that I want to retain for use only when running on .NET Standard. (For the curious, one set is to read a file on a Windows server that requires credentials to access.)
Is there:
A call that will tell me if I am running under .NET Standard vs. .NET Core.
Is there a way to have a class that is only going to be instantiated/called if running under standard, but the DLL will still load fine under Core?
Also asked on MSDN
Since what you describe (having a single NuGet package and being able to specify different behaviours or dependencies depending on the framework the package is installed into) can only be achieved through multi-targeting, I will assume you are doing that or will be doing it.
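For example, a hypothetical multi-targeted library csproj might declare:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- One build (and one set of preprocessor symbols) per framework. -->
    <TargetFrameworks>net472;netcoreapp3.0;netstandard2.0</TargetFrameworks>
  </PropertyGroup>
</Project>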
Once you have specified the target frameworks, you get predefined compilation symbols to use in preprocessor blocks:
#if NETFRAMEWORK
// use full framework class here. You were installed into a full framework app or library
#elif NETCOREAPP
// use .NET Core class here. You were installed into a .NET Core app or library
#else // NETSTANDARD
// uh... okay... you were installed into another .NET Standard library,
// we still have no idea where *that* might be installed... help?
// Maybe make it configurable after all?
#endif
.NET Standard is not a runtime, it is a set of APIs that a runtime must implement in order to be compatible. So basically this allows people to have a library target .NET Standard and have one code-base that will run in all supported runtimes because it is guaranteed that those runtimes will have an implementation for those APIs.
.NET Standard doesn't have an implementation at all; it just defines a contract, a set of APIs, that is used at compile time. At runtime, the APIs used will be the ones provided by whichever runtime the consumer decided to target their application for.
A better way to detect the runtime is to use the RuntimeInformation.FrameworkDescription API. We do that for our framework tests to know what we're running our tests on: https://github.com/dotnet/runtime/blob/master/src/libraries/Common/tests/CoreFx.Private.TestUtilities/System/PlatformDetection.cs#L21
You could also achieve this via reflection by doing something like: typeof(string).Assembly... if the assembly is System.Private.CoreLib you're on .NET Core, if it is mscorlib, you're in .NET Framework.
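A rough sketch combining both approaches (the class and property names are just for illustration):

using System.Runtime.InteropServices;

public static class RuntimeDetection
{
    // e.g. ".NET Framework 4.7.2" or ".NET Core 3.1.0"
    public static string Description => RuntimeInformation.FrameworkDescription;

    // Reflection-based check: the core library's name differs per runtime.
    public static bool IsNetCore =>
        typeof(string).Assembly.GetName().Name == "System.Private.CoreLib";

    public static bool IsNetFramework =>
        typeof(string).Assembly.GetName().Name == "mscorlib";
}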
Is there a way to easily convert a class library targeting .NET Core 2.0 to .NET Standard?
If I understand it correctly, if one wants to maximize the reusability of class libraries in projects targeting different .NET frameworks e.g. .NET Framework, .NET Core, Xamarin, etc., it's a better idea to target .NET Standard -- provided that all the required APIs are available in the version of .NET Standard that will be targeted.
This is the reason why I want to convert my class libraries from .NET Core 2.0 to .NET Standard 1.6 or .NET Standard 2.0.
In the project file, you can point the target framework to netstandard with the exact version.
Example of Proj.csproj:
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard1.6</TargetFramework>
  </PropertyGroup>
</Project>
...
Microsoft provides good documentation about targeting types.
.NET Standard is not a framework or a library; it is an abstract set of specifications describing what functionality System.Array, String, List, and so on should have. Currently, there are different implementations: .NET Framework, .NET Core, Mono, Xamarin, Windows Phone. This means that any of these implementations can compile and reuse your library if it targets netstandard. It is a very good choice for a NuGet package.
You can play with the versions and find the minimum set of functionality required for your library. Each .NET Standard version extends the functionality of the previous one. Thus, the lower the version you target, the more platforms your library will support.
You can edit the csproj file (can be done inside VS by right-clicking on the project file) and change
<TargetFramework>netcoreapp2.0</TargetFramework>
to
<TargetFramework>netstandard2.0</TargetFramework>
Go to the project folder.
Open the .csproj file.
Replace your target framework:
netcoreapp2.2 with netstandard2.0
My five cents on this topic: I had to convert libraries already written for .NET Core 3.1 and 5.0. When you open such a project, or add it to your current .NET Standard solution, it won't show the list of .NET Standard options.
The reason is that you have two references to update in order to make it work. First, change the target framework in the .csproj file. For example, to switch to .NET Standard 2.1 I modified the file as follows:
<PropertyGroup>
  <TargetFramework>netstandard2.1</TargetFramework>
</PropertyGroup>
Second, change the project type GUID in the solution (.sln) file, opening it with any text editor, or the solution won't recognize the project either. The GUID I used for the .NET Standard 2.1 project is 9A19103F-16F7-4668-BE54-9A1E7A4F7556. If you have a different version of the standard, the trick is to add a new project, check the GUID of its project type, and replace it in the solution file.
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = ...
EndProject
Since the major version number has changed I suppose that netstandard2.0 contains changes that are incompatible with netstandard1.*.
Are there any restrictions in using netstandard1.* libraries from the netstandard2.0 project?
You can use any netstandard1.* library in a netstandard2.0 project.
From .NET Standard versioning rules:
Additive: .NET Standard versions are logically concentric circles: higher versions incorporate all APIs from previous versions. There are no breaking changes between versions.
and, specifically on the .NET Standard 2.0 breaking-change question, a clarification:
Based on community feedback, we decided not to make .NET Standard 2.0 be a breaking change from 1.x. Instead, .NET Standard 2.0 is a strict superset of .NET Standard 1.6. The plan for handling .NET Framework 4.6.1 and .NET Standard 2.0 is outlined in the spec.
In general, the .NET Standard version of your project impacts the following:
The higher the version, the more APIs are available to you.
The lower the version, the more platforms implement it.
I've seen videos introducing ASP.NET vNext and been keeping up with the recent announcement blog posts, but detailed information on what's been stripped from the full framework appears slim. Here's what I think I know so far:
It's much smaller (11MB vs >200MB): http://davidzych.com/2014/05/24/getting-started-with-asp-net-vnext/
Strong naming is gone: http://jeremydmiller.com/2014/06/09/final-thoughts-on-nuget/
It's dumped System.Web
It includes a merged MVC and WebAPI (however I don't believe this is part of the framework itself but rather dependencies that can be specified)
Dependencies are completely managed through project.json, to the extent that the base class libraries themselves appear to be delivered as packages
Are we basically looking at a framework that includes nothing more than what's in mscorlib in the full framework, with everything else delivered via package management? And if that's the case, why would one need to target the framework specifically, as described here? http://blogs.msdn.com/b/webdev/archive/2014/06/17/dependency-injection-in-asp-net-vnext.aspx
The reason they specifically target NET45 in the link you supplied is that Autofac is built for, and has a dependency on, .NET 4.5. Without NET45 the code wouldn't compile.
My assumption is that as vNext gets closer to release, Autofac (and StructureMap, and Castle Windsor, and ...) will release versions that target the cloud-optimized framework to remove that dependency.
As far as I understand, .NET Framework is the full framework we know and love, with all the Windows-specific implementations and lots of code we don't normally use, like the XML parser they mention in some of the videos.
In .NET Core they removed all the unneeded implementations/dependencies and kept only the basic ones, which also enables cross-platform support (not yet complete), so in the future one could think of it as the only framework, the Core framework, running on any device. Their February community standup gives a lot of information and insight into their objectives and goals.
I see this as a transition, where some features are available only on the full framework, while in the future one might expect to see all features available for .NET Core.
From a Microsoft perspective, if they want to release, let's say, Entity Framework for mobile (EF7 is aiming at that), they must get rid of all the Windows-specific implementations in EF and its dependencies (the framework). So they created a framework without Windows dependencies, which also allows multiple framework versions to be installed side by side and removes some of the problems with updating the framework, by having it mostly isolated from the system and shipped with the application. New problems will come, like multiple copies of the same framework on one machine (one per application), which is why they are working on something called Smart Sharing.
This post may help you and give you some insight, especially this part:
The structure of .NET Core is comprised of two major components which add to and extend the capabilities of the .NET Framework as follows:
Runtime:
Built on the same codebase as the .NET Framework CLR. Includes the same GC and JIT (RyuJIT). Does not include features like Application Domains or Code Access Security. The runtime is delivered on NuGet (Microsoft.CoreCLR package).
Base class libraries:
Are the same code as the .NET Framework class libraries but do not contain dependencies, so they have a smaller footprint. Available on NuGet (System.* packages).
and I guess you already read Introducing .NET Core from Microsoft.
Regarding your concern about specifying a specific framework: right now, not everything works on CoreCLR, so you must choose which one to use, or you can target both and use different implementations.
As of right now, Core only runs on Windows; the Mono framework doesn't have a SQLite provider for Entity Framework, but Core does, so you can use an InMemory or Azure EF provider, for example, and choose depending on the environment your application is running in.
As Scott Gu says on the community standup, they envision a future where there's no mono framework or full framework, there's just Core, but that will take time if it ever happens.
I can't find an original source other than a comment by David Fowler (I believe) on a presentation from NDC, but CoreCLR used by the K Runtime is actually a reincarnation of the CLR used by Silverlight 2. It was used because it's small and designed to be cross platform. There is some additional information here: https://stackoverflow.com/a/25720160/113225