Implement gRPC service to replace .NET Framework WCF service

First of all, is this even possible? I know gRPC mainly works in .NET Core and I'm having issues implementing it in .NET Framework.
So far I've installed the necessary NuGet packages (Grpc.Core, Grpc.Core.Api, Grpc.Net.Client, Grpc.Net.Common, and Grpc.Tools) and tried using the tutorial from https://grpc.io/docs/languages/csharp/basics/ as a starting point.
Right now the main issue I'm running into is that my .proto file doesn't offer Build Action -> Protobuf Compiler in its properties, and it's not generating the code it's supposed to when the project is compiled.
Are there any better examples out there for me to reference?
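For reference, Grpc.Tools drives its code generation from <Protobuf> items in the project file, so one workaround when the build action doesn't show up is to add the item by hand. A minimal sketch, assuming the Grpc.Tools package is installed; the Protos\greet.proto path is illustrative:
<ItemGroup>
  <!-- Grpc.Tools generates the message classes and client/server stubs for this file at build time. -->
  <!-- GrpcServices can be "Server", "Client", "Both", or "None". -->
  <Protobuf Include="Protos\greet.proto" GrpcServices="Both" />
</ItemGroup>
After editing the project file, rebuilding (and, after first installing Grpc.Tools, restarting Visual Studio) is usually enough for the generated classes to show up.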

Related

Is there a way to have .NET Framework code in a .NET Core library?

We have a commercial library that I am working to port to .NET Core. There are a couple of calls in it I want to retain to use only if running in .NET standard. (For the curious, one set is to read a file on a Windows server that requires credentials to access.)
Is there:
A call that will tell me if I am running under .NET Standard vs. .NET Core.
Is there a way to have a class that is only going to be instantiated/called if running under standard, but the DLL will still load fine under Core?
Also asked on MSDN
What you describe, a single NuGet package that provides different behaviours or dependencies depending on the framework it is installed into, can only be achieved through multi-targeting, so I will assume you are doing that or will be doing it.
Once you have specified target frameworks, you have pre-defined symbols to use in conditional compilation blocks:
#if NETFRAMEWORK
// use full framework class here. You were installed into a full framework app or library
#elif NETCOREAPP
// use .NET Core class here. You were installed into a .NET Core app or library
#else // NETSTANDARD
// uh... okay... you were installed into another .NET Standard library,
// we still have no idea where *that* might be installed... help?
// Maybe make it configurable after all?
#endif
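For context, those symbols come from the target frameworks declared in the project file. A minimal multi-targeting sketch for an SDK-style project; the exact monikers are just an example:
<PropertyGroup>
  <!-- Each target is built separately and defines its own symbols (e.g. NET462, NETCOREAPP2_0,
       and on recent SDKs the family symbols NETFRAMEWORK / NETCOREAPP / NETSTANDARD used above). -->
  <TargetFrameworks>net462;netcoreapp2.0;netstandard2.0</TargetFrameworks>
</PropertyGroup>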
.NET Standard is not a runtime, it is a set of APIs that a runtime must implement in order to be compatible. So basically this allows people to have a library target .NET Standard and have one code-base that will run in all supported runtimes because it is guaranteed that those runtimes will have an implementation for those APIs.
.NET Standard doesn't have an implementation at all; it just defines a contract, a set of APIs, which is used at compile time. At run time, the APIs used will be the ones provided by whichever runtime the consumer decided to target their application for.
A better runtime detection would be to use the RuntimeInformation.FrameworkDescription API. We do that for our framework tests to know what we're running our tests on: https://github.com/dotnet/runtime/blob/master/src/libraries/Common/tests/CoreFx.Private.TestUtilities/System/PlatformDetection.cs#L21
You could also achieve this via reflection by doing something like typeof(string).Assembly: if the assembly is System.Private.CoreLib you're on .NET Core; if it is mscorlib, you're on .NET Framework.
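A small self-contained sketch combining both approaches, assuming RuntimeInformation is available (it is built in on .NET Core and on recent .NET Framework versions, and otherwise available as a NuGet package):
using System;
using System.Runtime.InteropServices;

static class RuntimeDetection
{
    // The core library is System.Private.CoreLib on .NET Core and mscorlib on .NET Framework.
    public static bool IsNetCore =>
        typeof(string).Assembly.GetName().Name == "System.Private.CoreLib";

    public static void Print()
    {
        // Prints something like ".NET Framework 4.8..." or ".NET Core 3.1...".
        Console.WriteLine(RuntimeInformation.FrameworkDescription);
        Console.WriteLine(IsNetCore ? "Running on .NET Core" : "Running on .NET Framework");
    }
}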

Project not compatible with netcoreapp2.0

I'm trying to add a full framework class library as a project reference to asp.net core 2.0 MVC project and getting the below error.
Project XYZ is not compatible with netcoreapp2.0 (.NETCoreApp,Version=v2.0).
Project XYZ supports: net462 (.NETFramework,Version=v4.6.2)
I have updated to the most recent version of Visual Studio, i.e. 15.3.5.
Is it even possible to reference 4.6.2 libraries in core 2.0 projects?
The first thing that you can try is to compile the library you want to consume as netstandard2.0.
Theoretically (according to the .net standard documentation), this will make it compatible with projects using net461 and later as well as netcoreapp2.0 and later.
In practice, you will sometimes hit a problem with one of your dependencies that doesn't provide the same library version across different compilation targets.
In such cases you may simply need to add .NET Core 2.0 as a target framework for the XYZ library.
The XML tag listing the targets is <TargetFrameworks> in the XYZ.csproj file and is not handled by the GUI of the project's properties.
So I would try editing the XYZ.csproj by hand and adding netcoreapp2.0 to (or replacing) what's listed in <TargetFrameworks>.
If you are adding it as additional target you need to separate them with ';' as in
<TargetFrameworks>net462;netstandard2.0;netcoreapp2.0</TargetFrameworks>
More details about this in this Microsoft doc.
Please keep in mind that this will trigger multiple compilations and will consequently slow down your build...
It should be. Microsoft announced a ".NET Framework Compatibility Mode" with the release of .NET Standard 2.0. However, they didn't go into great detail about how it works exactly, or what to troubleshoot if it doesn't. Additionally, they only specifically talk about it in relation to NuGet packages, so it's possible NuGet is playing some role in the process as well. Unfortunately, I've been unable to find any additional information about this feature outside of the announcement post.
That said, Microsoft's explicit recommendation is to not rely on the fact that your .NET Framework library may just happen to work in .NET Core; instead, you should be actively porting .NET Framework libraries you control to .NET Standard. I'd say you're likely going to spend more time trying to figure out why it doesn't "just work" than you would porting your code, so that it will definitely work, and be future-proof to boot.
The following solution worked for me.
Delete the bin and obj folders from all the projects in the solution and rebuild. If it still doesn't work, try changing the browser in the debug options; for example, if Chrome is already the default browser in Visual Studio, switch to Edge or Firefox.

Compatibility shim used by .NET Standard 2.0

Overviews (example) of .NET Standard 2.0 say that it now uses some kind of compatibility shim that fixes the third-party library compatibility issue. So you can use a third-party library with .NET Standard as long as it doesn't use any API which .NET Standard doesn't have.
What is not clear is
how does this shim work? any drawbacks?
and
how to check that a third-party library is supported? By directly adding it to the project and then trying to compile?
This works by creating all the necessary libraries that are referenced by classic .NET libraries.
E.g. in .NET Core the implementation of Object or Attribute is defined in System.Runtime. When you compile code, the generated code always references the assembly and the type => [System.Runtime]System.Object. Classic .NET projects, however, reference System.Object from mscorlib. When trying to use a classic .NET assembly on .NET Core 1.0/1.1, this usually leads to types not being found. In .NET Core 2.0, there will be "fake" types in an mscorlib that the runtime knows how to forward to where the implementation actually is.
You can read more about how this assembly unification works in the dotnet/standard GitHub repo, but the most important scenario is the one shown in that repository's diagram:
When a 3rd-party dll references [mscorlib]Microsoft.Win32.RegistryKey, there will be an mscorlib.dll that contains a type forward to [Microsoft.Win32.Registry]Microsoft.Win32.RegistryKey, so the reference resolves as long as a Microsoft.Win32.Registry.dll is present.
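The forwarding itself uses the TypeForwardedTo assembly attribute. A minimal sketch of the mechanism (illustrative, not the actual shim source), assuming the facade project references the assembly that really defines the type:
using System.Runtime.CompilerServices;

// Placed in the facade assembly (the "fake" mscorlib above): consumers keep compiling
// against the old assembly, and the runtime follows this forward to the real definition.
[assembly: TypeForwardedTo(typeof(Microsoft.Win32.RegistryKey))]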
This also shows the major downside: the registry is a Windows-only concept and not available on Mac or Linux, so this particular code may fail to run on non-Windows platforms. But if you only use the parts of the library that do not touch this functionality, it may still work for cross-platform scenarios.
Another problem is that even if an API is "available" to compile against and reference, it may still throw a PlatformNotSupportedException at run time.
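As an illustration, code that compiles fine against the registry API can still fail at run time on non-Windows platforms. A minimal sketch, assuming the Microsoft.Win32.Registry package is referenced:
using System;
using Microsoft.Win32;

class RegistryProbe
{
    static void Main()
    {
        try
        {
            // Compiles everywhere, but only Windows actually implements the registry.
            using (RegistryKey key = Registry.CurrentUser.OpenSubKey("Software"))
            {
                Console.WriteLine(key != null ? "Opened HKCU\\Software" : "Key not found");
            }
        }
        catch (PlatformNotSupportedException)
        {
            Console.WriteLine("No registry on this platform (e.g. Linux or macOS).");
        }
    }
}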
For example, a library that implements a file format for serialisation / deserialisation might work without modification, even if it has been built for .NET Framework 3.5.
To find what API functions a particular library uses, the .NET Portability Analyzer can be used to scan a dll and show if the library is compatible and if not, which APIs are blocking.

How to Add Reference to System.Data.Services.Client in .Net 5 Project

I am trying to add search to an Asp.Net 5 project. The search uses the Bing Search API.
As per the instructions in the "Bing Search API – Quick Start and Code Samples" I have downloaded a file called "BingSearchContainer.cs". This file has references to System.Data.Services.Client. The file is too big to put here but can be downloaded at https://datamarket.azure.com/dataset/explore/getproxy/5ba839f1-12ce-4cce-bf57-a49d98d29a44.
I added references to System.Data.Services and System.Data.Services.Client as they were not included in the generic Asp.Net 5 (RC1) template I have used (in Visual Studio 2015) to create the site.
Although this removes the errors in the files themselves, the errors are still present in the error list and the project won't build or run.
If I hover over the using statement for System.Data.Services.Client at the top of the BingSearchContainer file, it says DNX Core 5.0 - Not Available.
Does anyone know how I can solve this?
You need to be aware of the platforms you're targeting. .NET Core is a new runtime, and there are no built-in libraries. Everything must be added (generally as a NuGet package), even things that were previously available from the Standard Libraries.
Check and see if the library you want is available on NuGet. If not, you'll need to find some sort of workaround or stop targeting .NET Core and just focus on the full .NET Framework.
Some workarounds
Locate a different package that does what you want and is available for both .NET Core and the full .NET Framework
Use System.Data.Services.Client on full .NET Framework and an alternative framework for .NET Core, and use compiler directives to target specific blocks of code at specific versions of the framework
Locate the source for System.Data.Services.Client and try porting it to .NET Core. You should probably double-check with Microsoft to see if they already have plans to move it over, as well as to see if there's anyone else who might help you with it
Just compile your project for .NET Framework, and don't compile for .NET Core

Differences between .Net Full framework and the .Net Core Framework 4.5 used by K runtime?

I've seen videos introducing ASP.NET vNext and been keeping up with the recent announcement blog posts, but detailed information on what's been stripped from the full framework appears slim. Here's what I think I know so far:
It's much smaller (11MB vs >200MB): http://davidzych.com/2014/05/24/getting-started-with-asp-net-vnext/
Strong naming is gone: http://jeremydmiller.com/2014/06/09/final-thoughts-on-nuget/
It's dumped System.Web
It includes a merged MVC and WebAPI (however I don't believe this is part of the framework itself but rather dependencies that can be specified)
Dependencies are completely managed through project.json, to the extent that the base
Are we basically looking at a framework that includes nothing more than what's in mscorlib in the full framework, with everything else delivered via package management? And if this is the case, why would one need to target the framework specifically, as described here? http://blogs.msdn.com/b/webdev/archive/2014/06/17/dependency-injection-in-asp-net-vnext.aspx
The reason they specifically target NET45 in the link you supplied is that Autofac is built for and has a dependency on .NET 4.5. Without NET45 the code wouldn't compile.
My assumption is that once vNext gets closer to release, Autofac (and StructureMap, Castle Windsor, ...) will release versions that target the cloud-optimized framework to remove the dependency.
As far as I understand, .NET Framework is the full framework we know and love, with all the Windows implementations and lots of code we don't normally use (like the XML parser they mention in some of the videos).
In .NET Core they removed all the unneeded implementations/dependencies and left only the basic ones, which also enables cross-platform support (not there yet), so in the future one could think of it as the only framework, the Core framework, running on any device. Their February community standup gives a lot of information and insight on their objectives and goals.
I see this as a transition period, where some features are available only on the full framework, while in the future one might expect to see all features available for .NET Core.
From a Microsoft perspective, if they want to release, let's say, Entity Framework for mobile (EF7 is aiming at that), they must get rid of all the Windows dependencies in EF and in the framework it depends on. So they created a framework without Windows dependencies, which also makes it easier to install multiple frameworks side by side and removes some problems with updating the framework, since it is mostly isolated from the system and lives with the application. New problems will come, like multiple copies of the same framework on one machine (one per application); that's why they are working on something called Smart Sharing.
This post may help you and give you some insight, especially this part:
The structure of .NET Core is comprised of two major components which add to and extend the capabilities of the .NET Framework as follows:
Runtime: built on the same codebase as the .NET Framework CLR. Includes the same GC and JIT (RyuJIT). Does not include features like Application Domains or Code Access Security. The runtime is delivered on NuGet (Microsoft.CoreCLR package).
Base class libraries: the same code as the .NET Framework class libraries but without the extra dependencies, so they have a smaller footprint. Available on NuGet (System.* packages).
and I guess you already read Introducing .NET Core from Microsoft.
Regarding your concern about specifying a specific framework: right now not everything works on CoreCLR, so you must choose which one to use, or you can target both and use different implementations.
As of right now, Core only runs on Windows; the Mono framework doesn't have a SQLite provider for Entity Framework but Core does, so you can use an InMemory or Azure EF provider, for example, and choose depending on the environment your application is running in.
As Scott Gu says on the community standup, they envision a future where there's no mono framework or full framework, there's just Core, but that will take time if it ever happens.
I can't find an original source other than a comment by David Fowler (I believe) on a presentation from NDC, but CoreCLR used by the K Runtime is actually a reincarnation of the CLR used by Silverlight 2. It was used because it's small and designed to be cross platform. There is some additional information here: https://stackoverflow.com/a/25720160/113225
