Azure ASP.NET REST API and Database deployment

We started the planning phase of a new project and settled on an ASP.NET REST API to be hosted on Azure. Since none of us has any experience deploying to Azure (or any other cloud service), I have two questions.
Do you need separate Azure services for the database and the API, or is there a combined "package" for the prototype that can easily be changed later?
Is there any documentation, or are there any examples, of the entire deployment process for a simple dummy API and its database? I have spent the last few hours reading the official documentation and searching around, but I would really love to see some sort of reference, just to ensure I don't miss anything.
For now, the best I have found is this and this. This seems rather shallow, so I really hope there is more.

If you're looking for in-depth design and implementation details, the Azure Architecture Center is an excellent place to start; for hands-on experience there are hundreds of free courses available on Microsoft Learn.
Specifically, there are sections on API design and API implementation. From the Serverless web application page:
If you don't need all of the functionality provided by API Management, another option is to use Functions Proxies. This feature of Azure Functions lets you define a single API surface for multiple function apps, by creating routes to back-end functions. Function proxies can also perform limited transformations on the HTTP request and response. However, they don't provide the same rich policy-based capabilities of API Management.
Function Proxies
I would suggest starting with Azure Functions for your API (you only pay per call plus a combination of CPU, memory, and runtime, and the first 1,000,000 calls per month are free on the Consumption plan), rather than paying for an Azure App Service that hosts your API and runs all the time but is only utilized some of the time.
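As a very rough sketch of what one endpoint of your API could look like as an HTTP-triggered function (the names, route, and the in-process C# model are assumptions on my part, not something from your project):

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class GetItemFunction // hypothetical name
{
    // HTTP-triggered function exposed at GET /api/items/{id}
    [FunctionName("GetItem")]
    public static Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "items/{id:int}")] HttpRequest req,
        int id,
        ILogger log)
    {
        log.LogInformation("Fetching item {Id}", id);

        // Placeholder payload; a real implementation would query the database here.
        return Task.FromResult<IActionResult>(new OkObjectResult(new { Id = id, Name = "sample" }));
    }
}
```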
Some links that might help:
Build Serverless APIs with Azure Functions
Customize an HTTP endpoint in Azure Functions
There is an excellent summary in this article that states:
For heavy workloads:
Private (enterprise) API - API Management with a Premium plan.
Public API - Functions Proxy with the Premium plan.
For light/moderate workloads:
Private API - Functions Proxy with the Premium plan.
Public API - Functions Proxy with a Consumption plan and a custom warm-up solution.
From there you can use a connection string to an Azure SQL DB inside your functions to write to the database, or use something like Azure Managed Identity (yes, the link is for Azure Database for PostgreSQL, but the process is much the same for Azure SQL).
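A minimal sketch of the connection-string approach (the setting name, table, and columns are made up; you'd add the connection string to the Function App's application settings and call a helper like this from your function):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

public static class OrderWriter // hypothetical helper
{
    // Reads the connection string from app settings (e.g. a "SqlConnectionString"
    // entry in the Function App configuration) and inserts a row.
    public static async Task SaveOrderAsync(string customer, decimal total)
    {
        var connectionString = Environment.GetEnvironmentVariable("SqlConnectionString");

        using var connection = new SqlConnection(connectionString);
        await connection.OpenAsync();

        const string sql = "INSERT INTO Orders (Customer, Total) VALUES (@customer, @total)";
        using var command = new SqlCommand(sql, connection);
        command.Parameters.AddWithValue("@customer", customer);
        command.Parameters.AddWithValue("@total", total);

        await command.ExecuteNonQueryAsync();
    }
}
```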
In terms of deployment you should be looking at using Azure DevOps (or GitHub Actions):
Setting up a CI/CD pipeline for Azure functions (old way - GUI pipelines)
Deploy an Azure Functions app to Azure (new way - YAML pipelines)
Continuous Delivery for Azure SQL DB using Azure DevOps
Another helpful tool for gauging costs is the Azure Pricing Calculator.

Related

AWS CloudWatch with mobile applications

I have a backend system built on AWS and I'm using CloudWatch in all of the services for logging and monitoring. I really like the ability to send structured JSON logs into CloudWatch that are consistent and provide a lot of context around the log message. Querying the logs to get to the root of an issue, or just exploring the health of the environment, is simple, which makes CloudWatch a must-have for my backend.
Now I'm working on the frontend side of things, mobile applications using Xamarin.Forms. I know AWS has Amplify but I really wanted to stick with Xamarin.Forms as that's a skill set I've already got and I'm comfortable with. Since Amplify didn't support Xamarin.Forms I've been stuck looking at other options for logging - one of them being Microsoft's AppCenter.
If I go the AppCenter route I'll end up having to build out a mapping of the AppCenter installation identifier and my users between the AWS environment and the AppCenter environment. Before I start down that path I wanted to ask a couple of questions about best practice and the security of an alternative approach.
I'm considering using the AWS SDK for .NET: creating an IAM Role with a Policy that allows X-Ray and CloudWatch PUT operations on a specific log group, and then assigning it to an IAM User. I can issue access keys for the user and embed them in my app's config files. This would let me send log data right into CloudWatch from the mobile apps using something like NLog.
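For illustration, a minimal sketch of pushing a structured log line with the AWS SDK for .NET (the class name, region, and the idea of passing keys in are my assumptions; the log group and stream must already exist, and older SDK versions also require a sequence token on subsequent puts):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon;
using Amazon.CloudWatchLogs;
using Amazon.CloudWatchLogs.Model;
using Amazon.Runtime;

public class CloudWatchShipper // hypothetical helper
{
    private readonly IAmazonCloudWatchLogs _client;

    public CloudWatchShipper(string accessKey, string secretKey)
    {
        // Keys come from config, or from your API at runtime (the alternative approach below).
        _client = new AmazonCloudWatchLogsClient(
            new BasicAWSCredentials(accessKey, secretKey),
            RegionEndpoint.USEast1);
    }

    public Task SendAsync(string logGroup, string logStream, string jsonMessage)
    {
        var request = new PutLogEventsRequest
        {
            LogGroupName = logGroup,
            LogStreamName = logStream,
            LogEvents = new List<InputLogEvent>
            {
                new InputLogEvent
                {
                    Timestamp = DateTime.UtcNow,
                    Message = jsonMessage // the structured JSON log line
                }
            }
        };

        return _client.PutLogEventsAsync(request);
    }
}
```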
I noticed with AppCenter I have to provide a client secret to the app, which wouldn't be any different than providing an IAM User access key to my app for pushing into CloudWatch. I'm typically a little shy about issuing access keys from AWS but as long as the Policy is tight I can't think of any negative side-effects... other than someone flooding me with log data should they pull the key out of the app data.
An alternative route I'm exploring is, instead of embedding the access keys in my config files, requesting them from my API services and holding them in memory. The only downside is that when the user doesn't have internet connectivity, logging might be a pain (I'll need to look at how NLog handles sinks that aren't currently available - queueing and flushing).
Is there anything else I'm not considering or is this approach a feasible solution with minimal risk?

.Net Core querying records from different microservices

I'm learning how to design and implement microservices using serverless technologies. I'm trying to create autonomous microservices and am having difficulty understanding how to communicate data across/between them. I'm using .NET Core for my microservices and want each microservice to be an AWS Lambda function exposed via API Gateway.
I have the following scenario:
Institution microservice - returns a list of institutions within a radius (25 miles) of a zipcode.
ROI Calculator microservice - receives a zip code as input and calls the Institution microservice, receiving a list of institutions. For each institution returned, it performs a series of calculations yielding an ROI value.
How should the ROI Calculator microservice make a call to the Institution microservice?
An ASP.NET Core Web API application can be published as-is to AWS Lambda as a serverless function. You get everything that a regular .NET Core application provides, like controllers, models, etc. The Amazon API Gateway proxy is integrated directly into the .NET Core API routing system, so your AWS Lambda function will serve your .NET Core Web API. You should watch these tutorials for starters to get a better understanding:
Create .NET Core AWS lambda function
.NET core AWS Lambda Microservices
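For reference, the Lambda entry point in that kind of project looks roughly like this sketch (the project name and Startup class are placeholders); it is the piece that translates API Gateway proxy requests into normal ASP.NET Core requests, so your controllers and routing keep working inside Lambda:

```csharp
using Amazon.Lambda.AspNetCoreServer;
using Microsoft.AspNetCore.Hosting;

namespace InstitutionService // hypothetical project name
{
    // API Gateway proxy requests are handed to the ASP.NET Core pipeline here.
    public class LambdaEntryPoint : APIGatewayProxyFunction
    {
        protected override void Init(IWebHostBuilder builder)
        {
            builder.UseStartup<Startup>();
        }
    }
}
```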
If you go by the template provided by the AWS SDK (the ASP.NET Core Web API template) and publish your .NET Core Web API to AWS, it will configure everything for you, including the AWS Lambda function and the API Gateway. So if you create two .NET Core Web API projects, you will have two API Gateways. The problem is that ten microservices would mean ten API Gateways, whereas ideally we should have one API Gateway in front of multiple microservices.
I worked on a POC recently that has one API Gateway with all the microservice AWS Lambda functions behind it. Each microservice has a base path (e.g. shopping or users) set up in its Startup.cs that identifies it individually behind the API Gateway. So one microservice will be apigateway/shopping/{anything}, another will be apigateway/users/{anything}, and both are configured behind the same API Gateway. API Gateway sends the request to the AWS Lambda function (the .NET Core Web API), and the request is resolved by the .NET Core routing system. Even multiple controllers can be used this way in a single Web API project without a problem.
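A rough sketch of that base-path setup for one of the services (the "shopping" path and the rest of the Startup are just placeholders):

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;

// Hypothetical Startup for the "shopping" microservice.
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddControllers();
    }

    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        // Everything this service exposes lives under /shopping,
        // so API Gateway can route apigateway/shopping/* to this Lambda.
        app.UsePathBase("/shopping");

        app.UseRouting();
        app.UseEndpoints(endpoints => endpoints.MapControllers());
    }
}
```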
I have modified serverless.template so we publish only the AWS Lambda function and configure API Gateway separately. You can find a code sample and details on my GitHub blog here: .NET Core Web API AWS Lambda Microservices Sample.
There are two ways of doing this; which is best probably depends on how independent your microservices need to be:
Make an internal HTTP call from the ROI service to the Institution service, which would be okay (see the sketch below). The problem with this is that if the Institution service is down, the data will not be available.
Store the data needed to make the calculation inside the ROI service as well. This may seem strange, but once the data has been created in, say, the Institution service, it can be sent via a message bus to the ROI service, which then uses the data when needed. (This may not suit your domain, though; it depends on what information the calculation needs.)
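A rough sketch of option 1, the internal HTTP call (the route, payload shape, and client wiring are all assumptions; the base address would point at the Institution service's API Gateway URL):

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

// Hypothetical shape of the Institution payload.
public record Institution(string Name, string ZipCode);

public class InstitutionClient
{
    private readonly HttpClient _http;

    public InstitutionClient(HttpClient http) => _http = http;

    public async Task<IReadOnlyList<Institution>> GetNearbyAsync(string zipCode)
    {
        // Placeholder route; use whatever the Institution service actually exposes.
        var institutions = await _http.GetFromJsonAsync<List<Institution>>(
            $"api/institutions?zip={Uri.EscapeDataString(zipCode)}");

        return institutions ?? new List<Institution>();
    }
}
```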
However, it seems that the calculation and the storage of the institutions could live within the same microservice, eliminating the problem altogether.

Is there a Firebase API endpoint to fetch account usage?

I'm building a SaaS project on Firebase and weighing the pros and cons of a multi-tenant architecture versus managed Firebase instances for each account.
One of the key challenges with managed instances is reporting usage. I've searched through the Firebase docs but have not been able to find an API endpoint to fetch a Firebase project's account usage (e.g. bandwidth, function invocations, storage, etc.).
Is there an API that I'm missing, or is this possible with the core Google Cloud APIs instead?
firebaser here
We just added a REST API to manage projects and the apps in those projects. But there is currently no way to report usage for a project across all Firebase products. It sounds like a useful feature though, so I recommend filing a feature request.
Until a feature is added that fits your needs, you will have to do the tracking from within your own app, or by proxying the tracked functionality through Cloud Functions, where you can then log whatever you need for tracking usage.

Using a microservice architecture through Firebase Cloud Functions

What is the scope for implementing a microservice architecture using Firebase Cloud Functions? Is it a correct way to do it, or is it a step backward? As we have seen, Firebase is built to be a serverless application back-end, but with multiple triggers and support for HTTPS, should we try to get back to microservices? Just to try it out, I implemented multiple services on Firebase Cloud Functions with multiple URLs, and they had a really good response time, averaging 500 ms.
This is a very challenging question to answer. It is not a step backward; you can think of Cloud Functions as a tool that you can use along with other technologies to implement your microservice strategy. For instance, if you are going to be leveraging the Firebase Database and other features within Firebase, then it makes sense to use Cloud Functions for Firebase.
Let's say you don't want to use Cloud Functions for Firebase and you choose another technology such as Kubernetes or App Engine. First, you'll have to add the Firebase SDKs to that stack and make sure it can access your Firebase project; you get that access for free in Cloud Functions for Firebase. Next, you will write the same code that you would have implemented in the Cloud Function. Finally, you will have additional steps for deploying those technologies. Leveraging Cloud Functions for Firebase will be quicker and more productive.
As time goes on it will become more apparent when to use an additional technology. I recently wrote a blog post about when I would choose Container Engine over Cloud Functions. This topic can become subjective since it's really based on your needs, features, and the technologies you are working with.
Cloud Functions vs Container Engine

Service Bus architecture for ASP.NET Web API

I am developing a mobile application using Telerik Platform. The services consumed by the app are ASP.NET Web API RESTful services which are hosted on Azure. I'd like to build some resilience into the app by adding a service bus and have been looking at Azure Service Bus which seems to be what I'm looking for.
It's fairly new to me and I have a few questions.
Can Azure Service Bus be used for RESTful services that return data or are they fire-and-forget only?
For simple RESTful services is Azure Service Bus the way to go or Azure Storage Queue? When would you use one vs the other?
When would I use a Queue vs Topic / Subscription?
ASB is about messaging. You use messaging for communication between parts of your system/services. RESTful services can leverage ASB by translating a request into a message that triggers some work. The emphasis is on converting the intent into a message that instructs what work needs to take place, not on executing the work itself.
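As a rough illustration of that pattern (the queue name, payload shape, and DI wiring are my assumptions, using the Azure.Messaging.ServiceBus package): the controller accepts the HTTP request, turns it into a message, and returns immediately, leaving a separate worker to process the queue.

```csharp
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Microsoft.AspNetCore.Mvc;

// Hypothetical request payload.
public record CreateOrderRequest(string ProductId, int Quantity);

[ApiController]
[Route("api/orders")]
public class OrdersController : ControllerBase
{
    private readonly ServiceBusSender _sender;

    // The sender would be created from a ServiceBusClient registered in DI,
    // e.g. client.CreateSender("orders").
    public OrdersController(ServiceBusSender sender) => _sender = sender;

    [HttpPost]
    public async Task<IActionResult> Post(CreateOrderRequest request)
    {
        // Convert the intent into a message; a back-end worker performs the actual work.
        var message = new ServiceBusMessage(System.Text.Json.JsonSerializer.Serialize(request))
        {
            ContentType = "application/json"
        };

        await _sender.SendMessageAsync(message);

        // 202 Accepted: the request was queued for asynchronous processing.
        return Accepted();
    }
}
```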
ASB or ASQ is your choice. This is where you need to choose between the features and capabilities each provides. There's good Microsoft comparison documentation on it.
Queues vs Topics/Subscriptions - if you need to send a message to a single destination (a command), then a queue is simpler. If a message needs to be broadcast to multiple receivers (events), topics/subscriptions are your friends.
