Microsoft Dynamics 365 and ASP.NET Core Web API

I am working on an ASP.NET Core 5 Web API project, and some actions need approval from a specific user before they can be completed.
For example: a user wants to update the price of a product, but the change needs approval from their manager before it takes effect.
I was trying to find a workflow engine that works well with .NET. Since the organization I'm working for already has Dynamics 365, which has its own workflow engine, I'm trying to use it with the project to handle actions that need a workflow.
Any help or suggestions?
Thank you

This seems like a good fit for flow approvals.
In short, create a table in Dynamics to track your request and trigger an approval flow when the record is created.
See: https://learn.microsoft.com/en-us/power-automate/modern-approvals
The example shows SharePoint being used to trigger an approval, but that can simply be replaced by Dataverse (the data platform Dynamics 365 runs on).
Edit: this video has a walkthrough of its use: https://m.youtube.com/watch?v=VHiDP13U_HM
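To give an idea of the Web API side, here is a minimal sketch of creating the tracking record through the Dataverse Web API so the approval flow fires on record creation. The table name ("new_priceapprovals") and its columns are hypothetical, and the sketch assumes you have already acquired an OAuth bearer token for the environment (e.g. via MSAL):

// Minimal sketch: create a row in a hypothetical custom Dataverse table
// "new_priceapprovals" so that a Power Automate approval flow can trigger
// on record creation. Assumes an OAuth bearer token has already been
// acquired for the Dataverse environment.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

public class ApprovalRequestClient
{
    private readonly HttpClient _http;

    public ApprovalRequestClient(HttpClient http, string accessToken)
    {
        _http = http;
        _http.BaseAddress = new Uri("https://yourorg.crm.dynamics.com/api/data/v9.2/");
        _http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);
        _http.DefaultRequestHeaders.Add("OData-MaxVersion", "4.0");
        _http.DefaultRequestHeaders.Add("OData-Version", "4.0");
    }

    // Creating the row is what kicks off the approval flow.
    public async Task RequestPriceChangeAsync(Guid productId, decimal newPrice)
    {
        var payload = JsonSerializer.Serialize(new
        {
            new_name = $"Price change for {productId}", // hypothetical columns
            new_productid = productId,
            new_proposedprice = newPrice
        });

        var response = await _http.PostAsync(
            "new_priceapprovals", // entity set name of the hypothetical table
            new StringContent(payload, Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();
    }
}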

Related

Querying Azure DevOps commits to display on an "update" page

I'm reworking our company's management website and would like to display any commits or updates from our Azure DevOps project on an "update" page for admins to view.
What would be the easiest way to do this?
Thanks
You can also check out the Azure DevOps REST API to retrieve the Git commits for a project.
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits?api-version=5.1
You will probably need to write code to extract and display the required properties (e.g. committer, author, url) of the commits from the results.
You can check the examples in this document to learn how to make Azure DevOps Services REST API calls.
You can also check out the .NET client library. For Git commits you can refer to the GitHttpClientBase.GetCommitsAsync method.
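For illustration, here is a minimal sketch of calling the commits endpoint above from C# with a personal access token (PAT); the organization, project and repository values are whatever applies to your project, and the JSON property names match the commit shape the endpoint returns:

// Minimal sketch: fetch recent commits via the REST endpoint shown above,
// authenticating with a PAT sent as the password half of Basic auth.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

public static class CommitFetcher
{
    public static async Task PrintRecentCommitsAsync(
        string organization, string project, string repositoryId, string pat)
    {
        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic",
            Convert.ToBase64String(Encoding.ASCII.GetBytes($":{pat}")));

        var url = $"https://dev.azure.com/{organization}/{project}/_apis/git/" +
                  $"repositories/{repositoryId}/commits?api-version=5.1";
        using var doc = JsonDocument.Parse(await http.GetStringAsync(url));

        // Each entry in "value" carries commitId, author, committer, comment, url.
        foreach (var commit in doc.RootElement.GetProperty("value").EnumerateArray())
        {
            Console.WriteLine(
                $"{commit.GetProperty("commitId").GetString()} " +
                $"{commit.GetProperty("author").GetProperty("name").GetString()}: " +
                $"{commit.GetProperty("comment").GetString()}");
        }
    }
}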
Hope the above helps!

Microsoft Dynamics Integration with Magnolia

We need Magnolia 5.5 integration with Microsoft Dynamics (CRM), but as per the following Magnolia documentation the integration is not available out of the box with Magnolia: Magnolia Documentation
How can we build that functionality in Magnolia? Please advise.
Thanks in advance.
It all depends on what kind of API Dynamics exposes for such integration, and which features of Dynamics exactly you want to integrate.
If there is any REST-based API, you can have a look at similar integrations (those for SugarCRM, Eloqua or Salesforce come to my mind) and do what needs to be done. Source code for the above integrations provided by Magnolia is, AFAIK, available to all Enterprise customers.
Typically an integration has two parts:
- a backend part, where you create a content connector and an app that let your editors interact with Dynamics and select items from it, and
- some templating functions that allow templates to understand the items previously selected by the editors and retrieve them from Dynamics when rendering the template.
- Typically you will also have to deal somehow with authentication between Magnolia and Dynamics and (unless everything is super fast) with caching items retrieved from Dynamics in Magnolia in some form of volatile cache.
But really, any more detail on what to do and how depends on the use case. It would be different for building a customer self-service portal than for, say, just listing the phone number of the sales/support rep closest to a visitor of the site based on geolocation.

Populate Azure Mobile Apps backend data

Hi and thanks in advance.
I'm experimenting with a Xamarin.Forms app to handle various events organized by my company (one a month), and I'm taking a cue from the great app made by Xamarin for Evolve16.
But I don't understand how I can populate the database created in the Azure backend with Code First. I know there is a Seed method to do this the first time, but what about afterwards? Every month I need to add new data (for example, new sessions) quickly, and I don't want to provide this functionality within the app, because the app should only deliver content to users. All domain objects inherit from EntityData, so I don't know if I can use LINQPad or SSMS to insert data directly, because there are fields populated automatically (Version, CreatedAt, UpdatedAt...). Can I use the REST API of the backend table controllers? But where is the "try out" possibility that there was in Azure Mobile Services?
There are several ways to answer this. Basically, you want to alter the underlying tables in the SQL database:
1) aka "the simple version" - download SQL Server Management Studio and do raw inserts into the table.
2) aka "the separate website" - write an ASP.NET web app that uses Entity Framework to do the inserts for you. Make sure you include the Azure Mobile Apps Server SDK and make your models inherit from EntityData (a sketch of this option follows the list).
3) aka "the combined website" - download the Node project that underlies the service and adjust it to add your own website that can do the inserts for you.
The "try out" option is now implemented with swagger. Just go to https://yoursite.azurewebsites.net/swagger

Azure Worker Role Deployment for background data

I want my application to be in two phases. One part will simply fetch data in JSON format from an API and store it in a SQL database (or maybe a NoSQL one), and the other half (the web part) will read the data and implement customized alerts. So basically I need to create a worker for the fetch process. But I'm confused between a worker role and a web role in Azure. Kindly help me with the best possible way to implement this design.
You can just merge both in the same web role - the part of the code running in IIS (the ASP.NET project created when you create a web role from a Visual Studio template) will handle web requests, and the part running in the "role entry point" will run the fetch process (see the sketch below). Unless you absolutely need to scale them separately, this will give you a simpler and more manageable solution.
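For illustration, a minimal sketch of what that role entry point could look like; FetchAndStore is a placeholder for your actual API-to-SQL logic:

// Sketch of the "role entry point" approach: the same web role hosts the
// ASP.NET site in IIS, while this class runs the background fetch loop.
using System;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override void Run()
    {
        while (true)
        {
            try
            {
                FetchAndStore(); // pull JSON from the API, write to SQL
            }
            catch (Exception ex)
            {
                // Log and keep the loop alive; an unhandled exception here
                // would recycle the role instance.
                System.Diagnostics.Trace.TraceError(ex.ToString());
            }
            Thread.Sleep(TimeSpan.FromMinutes(5)); // polling interval
        }
    }

    private void FetchAndStore()
    {
        // placeholder: call the external API and persist the results
    }
}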
Have you looked at this tutorial? It gives possible use cases and tutorials for both web and worker roles.
http://www.windowsazure.com/en-us/documentation/articles/cloud-services-dotnet-multi-tier-app-storage-1-overview/

Integration of Microsoft Fast Search and Tridion 2011 SP1

We are upgrading to Tridion 2011 SP1, and as part of the Tridion search implementation we are using FS4SP (FAST Search for SharePoint 2010).
In the proposed implementation, the search environment consists of the following servers:
FS4SP
FISE
Can someone guide us on how to push content to FAST from Tridion and how to retrieve the same?
(For various reasons we are not considering having FAST crawl the website.)
Which APIs can be used for this implementation?
If you don't want to use the crawling approach, you will need to create a custom deployer. Please take a look at this other article:
How can we integrate Microsoft FAST with SDL Tridion 2011 SP1?
Alternatively, if you don't have a development team who is familiar with Java, you might consider creating a .NET application which updates your FAST index based on either a file system or database trigger when your pages or components are published, updated or deleted from your broker repository (a rough sketch of this approach is below).
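As an illustration of the file system trigger variant, here is a minimal .NET sketch; the watched path is hypothetical, and the FAST submission calls are placeholders, since they depend on which FAST content API you use:

// Sketch: watch the directory the deployer publishes pages to and push
// each created/changed page to FAST, removing deleted ones.
using System;
using System.IO;

public class PublishWatcher
{
    public static void Main()
    {
        using var watcher = new FileSystemWatcher(@"C:\inetpub\published", "*.html")
        {
            IncludeSubdirectories = true,
            EnableRaisingEvents = true
        };
        watcher.Created += (_, e) => PushToFastIndex(e.FullPath);
        watcher.Changed += (_, e) => PushToFastIndex(e.FullPath);
        watcher.Deleted += (_, e) => RemoveFromFastIndex(e.FullPath);

        Console.WriteLine("Watching for published pages. Press Enter to exit.");
        Console.ReadLine();
    }

    private static void PushToFastIndex(string path)
    {
        // placeholder: build the FAST XML for this page and submit it
    }

    private static void RemoveFromFastIndex(string path)
    {
        // placeholder: delete the corresponding document from the index
    }
}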
You will probably want to create XML for FAST and have the Custom Deployer (or Event System) send the content to FAST.
First create the FAST XML that works and write a sample app so you can insert it into the FAST index from either a .NET or Java application. This does not yet involve Tridion.
Then write your Custom Deployer or Event System and pass the XML to FAST.
If you are using a Custom Deployer approach, I would suggest contacting Tridion Professional Services if you have not done this before or are not a Java programmer. The new Tridion 2011 Storage API provides new opportunities for the Custom Deployer. In the meantime I would suggest appending the FAST XML to the end of the normal page content, surrounded by some markers, and having your custom deployer pull it out of the page output, send it to FAST, then remove it from the output before continuing.
This is a fairly difficult challenge for those who do not have serious Content Delivery / Deployer / Java skills. However, if you want to go for it yourself, I would suggest allowing at least two weeks to research existing solutions and experiment with the API.
Using the Event System might be a little easier - but your success or failure message will not appear in the Publish Queue, and if the search index fails to update you can only log the failure, not pass the info back to users.
