Does anyone have any idea how to implement distributed tracing for gRPC in Application Insights? I am aware that Application Insights does not support gRPC tracing for now, and I have tried using OpenTelemetry to log these traces into App Insights, but it isn't working correctly. Can anyone provide an example or guidance for this?
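For context, this is roughly the direction I have been trying: wiring up OpenTelemetry tracing at startup and exporting the spans to Application Insights. The sketch below assumes the OpenTelemetry .NET SDK with the ASP.NET Core and Grpc.Net.Client instrumentation packages plus the Azure Monitor trace exporter; the configuration key is a placeholder for wherever the connection string lives.

```csharp
// Startup.ConfigureServices - a minimal sketch, assuming the
// OpenTelemetry.Extensions.Hosting, OpenTelemetry.Instrumentation.AspNetCore,
// OpenTelemetry.Instrumentation.GrpcNetClient and Azure.Monitor.OpenTelemetry.Exporter packages.
using Azure.Monitor.OpenTelemetry.Exporter;
using OpenTelemetry.Trace;

public void ConfigureServices(IServiceCollection services)
{
    services.AddGrpc();

    services.AddOpenTelemetryTracing(tracing => tracing
        // Incoming HTTP/gRPC requests become server spans ("requests" in App Insights).
        .AddAspNetCoreInstrumentation()
        // Outgoing Grpc.Net.Client calls become client spans ("dependencies" in App Insights).
        .AddGrpcClientInstrumentation()
        // Ships the spans to Application Insights.
        .AddAzureMonitorTraceExporter(options =>
            options.ConnectionString = Configuration["ApplicationInsights:ConnectionString"]));
}
```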
We have a .NET Core (3.1) API connected to Application Insights with AddApplicationInsightsTelemetry in Startup's ConfigureServices, as documented here.
We also have a web job in the same API. The logging there does not work without extra configuration. For it to work, we connected to Application Insights again with .AddApplicationInsightsWebJobs in Program.cs, as documented here.
However, once the configuration for the web job is applied, we stop receiving the API's "request" and "dependency" telemetry. Without those, other Application Insights features such as the application map aren't available either.
Are we doing anything wrong, or missing anything, to get Application Insights working for both the API and the web job?
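To make the question concrete, here is a simplified sketch of how the two registrations are combined in our case (the configuration key name is illustrative, not our exact code):

```csharp
// Program.cs (.NET Core 3.1) - the web host and the web job share the same generic host.
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder => webBuilder.UseStartup<Startup>())
        .ConfigureWebJobs(b => b.AddAzureStorageCoreServices())
        .ConfigureLogging((context, logging) =>
        {
            // Wires the web job's ILogger output up to Application Insights.
            logging.AddApplicationInsightsWebJobs(options =>
                options.InstrumentationKey =
                    context.Configuration["APPINSIGHTS_INSTRUMENTATIONKEY"]);
        });

// Startup.cs - this registration is what produces the "request" and
// "dependency" telemetry (and therefore the application map) for the API.
public void ConfigureServices(IServiceCollection services)
{
    services.AddApplicationInsightsTelemetry();
    services.AddControllers();
}
```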
I have a project which is an ASP.NET Core REST API, and the UI is hosted using HTTP.sys.
IIS doesn't come into the picture here. There are multiple backend services created as Windows services, which involve many API calls.
I am looking for a memory profiler tool to monitor performance. Any suggestions would be helpful. I have tried using .NET Memory Profiler, but in the absence of IIS things are difficult to manage with it.
Can anyone please suggest a better tool?
Have you tried MiniProfiler?
https://miniprofiler.com/dotnet/AspDotNetCore
There is also a MiniProfiler walkthrough video: https://www.youtube.com/watch?v=7ThPz-9XM54
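Wiring it up in an ASP.NET Core project is only a few lines; a minimal sketch, assuming the MiniProfiler.AspNetCore.Mvc NuGet package and a route prefix of your choosing:

```csharp
// Startup.cs - minimal MiniProfiler setup for an ASP.NET Core API.
public void ConfigureServices(IServiceCollection services)
{
    services.AddMiniProfiler(options =>
    {
        // The profiler UI and results are served from this path, e.g. /profiler/results-index.
        options.RouteBasePath = "/profiler";
    });
    services.AddControllers();
}

public void Configure(IApplicationBuilder app)
{
    // Must be registered before the endpoints you want to time.
    app.UseMiniProfiler();
    app.UseRouting();
    app.UseEndpoints(endpoints => endpoints.MapControllers());
}
```

Note that MiniProfiler measures timings (requests, SQL, custom steps) rather than memory, so it complements rather than replaces a memory profiler.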
New Relic also has a profiler (if you are using it for development purposes only):
https://newrelic.com/lp/dotnet-monitoring
There is one from JetBrains as well (dotTrace)
https://www.jetbrains.com/help/profiler/Profile_.NET_Core_Application.html
Currently we use an ASP.NET Core project for our UI. There is a need to add an OData service to it. As far as I understand, OData is not yet supported well enough in ASP.NET Core, so it was decided to implement it in a separate Web API 2 project.
Does anyone have experience with such a setup?
Is it a reasonable way to go?
Might I run into trouble deploying to Azure?
Any ideas and thoughts would be appreciated.
If you host the web site in a different web application, then you have to enable CORS on the Web API, and you should test it during development to avoid any surprises.
You can avoid the CORS issue altogether by deploying both the UI and the API to the same Azure Web App: keep the UI under the root and the API under a virtual directory, e.g. apis.
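For the CORS part, in a Web API 2 project it looks roughly like this (a sketch, assuming the Microsoft.AspNet.WebApi.Cors NuGet package; the origin below is a placeholder for wherever the UI is hosted):

```csharp
// WebApiConfig.cs in the Web API 2 project.
using System.Web.Http;
using System.Web.Http.Cors;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Allow the UI's origin to call this API; "*" allows all headers and methods.
        var cors = new EnableCorsAttribute("https://your-ui-site.azurewebsites.net", "*", "*");
        config.EnableCors(cors);

        config.MapHttpAttributeRoutes();
    }
}
```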
I have to export data from a database in the form of flat files. I already have an ASP.NET website which saves data into the DB. I am thinking of creating a WCF web service project as part of the website solution. This WCF web service will have methods to export flat files. I am also planning to create a console app to call this web service at scheduled times.
I have the following questions:
Once the website is hosted on IIS along with the WCF service, can the console app call it, or does the WCF service have to be hosted separately?
How to debug the process?
Is there any better way of doing it?
Once the website is hosted on IIS along with the WCF service, can the console app call it, or does the WCF service have to be hosted separately?
The console app can call the WCF web service. It does not have to be hosted separately.
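For example, the console app can talk to the IIS-hosted endpoint directly through a ChannelFactory; a sketch, where IExportService, ExportFlatFiles and the address are placeholders for your actual contract and URL:

```csharp
// Console app calling the IIS-hosted WCF service without a generated proxy.
using System.ServiceModel;

[ServiceContract]
public interface IExportService
{
    [OperationContract]
    void ExportFlatFiles();
}

class Program
{
    static void Main()
    {
        var binding = new BasicHttpBinding();
        var address = new EndpointAddress("http://localhost/MySite/ExportService.svc");

        var factory = new ChannelFactory<IExportService>(binding, address);
        IExportService client = factory.CreateChannel();

        client.ExportFlatFiles();          // kick off the export

        ((IClientChannel)client).Close();  // clean up the channel and factory
        factory.Close();
    }
}
```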
How to debug the process?
Ideally, on your own PC. An easy way to do this is to launch the WCF web service in one instance of Visual Studio and the console app in another instance of VS. You can put breakpoints in each of the projects and follow the logic.
Is there any better way of doing it?
There are many ways to do one thing, but yours in this case looks good to me.
Your plan sounds fine to me.
If I were doing this, I might create a WCF service with one-way operations so that the client app is not left waiting for a response until the job is complete.
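A one-way operation is just a flag on the operation contract; a small sketch with made-up names (one-way operations must return void):

```csharp
using System.ServiceModel;

[ServiceContract]
public interface IFlatFileExportService
{
    // The client returns as soon as the message is delivered,
    // instead of blocking until the export finishes.
    [OperationContract(IsOneWay = true)]
    void StartExport(string exportName);
}
```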
I might use PowerShell and a scheduled task to hit the WCF service, or even use the free Pingdom service to hit the endpoint at intervals.
To debug locally: if the WCF service is its own project, make sure it's set as the startup project in VS, then apply breakpoints, run the debugger, and request the endpoint through the browser or Fiddler.
I'm writing components of a .NET 4.0 web solution (on IIS 7, Windows Server 2008) and need to provide a service which can consume messages from a message queue. I've found setup examples for configuring WAS service activation using MSMQ... but we aren't using MSMQ (we're using RabbitMQ), and I'm pretty sure I'll have to implement some kind of listener of my own.
I guess my problem is that the system of configuration settings I'll have to set up is pretty opaque, and the documentation is not clear.
So, A: how do I implement and configure a custom listener for WAS service activation?
And B: any advice for configuring the rest of this setup would be wonderful.
Thanks
It's better to consume the queue from a Windows service, which is a pain. We are using an open-source project called Topshelf, hosted on Google Code. The documentation is rubbish, but it has a feature where it will auto-run all DLL files placed in a directory as Windows services, which makes deployment and upgrades easy.
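For what it's worth, a bare-bones version of that listener using the standard Topshelf console-host pattern (rather than the drop-a-DLL shelving feature mentioned above) looks something like the sketch below. The queue name, host name and service name are placeholders, and it assumes the Topshelf and RabbitMQ.Client NuGet packages.

```csharp
using System;
using System.Linq;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;
using Topshelf;

// Holds the RabbitMQ connection open and processes messages as they arrive.
public class QueueListener
{
    private IConnection _connection;
    private IModel _channel;

    public void Start()
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        _connection = factory.CreateConnection();
        _channel = _connection.CreateModel();
        _channel.QueueDeclare("work-items", true, false, false, null);

        var consumer = new EventingBasicConsumer(_channel);
        consumer.Received += (sender, args) =>
        {
            var body = Encoding.UTF8.GetString(args.Body.ToArray());
            Console.WriteLine("Received: " + body);   // hand off to your processing code here
            _channel.BasicAck(args.DeliveryTag, false);
        };
        _channel.BasicConsume("work-items", false, consumer);
    }

    public void Stop()
    {
        if (_channel != null) _channel.Close();
        if (_connection != null) _connection.Close();
    }
}

public class Program
{
    public static void Main()
    {
        // Topshelf turns this console app into an installable Windows service.
        HostFactory.Run(host =>
        {
            host.Service<QueueListener>(svc =>
            {
                svc.ConstructUsing(settings => new QueueListener());
                svc.WhenStarted(listener => listener.Start());
                svc.WhenStopped(listener => listener.Stop());
            });
            host.RunAsLocalSystem();
            host.SetServiceName("RabbitQueueListener");
        });
    }
}
```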
In the WCF/WF samples there's source for a UDP activator. I wish there were one for AMQP.