Deploy Azure Face API for IoT Edge - microsoft-cognitive

Is it possible to deploy an Azure Face API trained model to IoT Edge, as you can with Custom Vision?
If so, how can I do that?

Updating this topic...
You can now download a Docker image with the Face API for running it on-premises.
Here you can find the documentation for testing this feature, which is currently in public preview.
Here you can see the list of all the Azure Cognitive Services that are available as Docker containers.
This new feature mainly targets enterprises that:
Are not willing or able to load all their data into the cloud for processing or storage;
Are subject to regulatory requirements on handling customer data;
Have data that they aren't comfortable sharing and processing in a cloud, regardless of security;
Have weak bandwidth, or disconnected environments with high latency and TPS issues.

Model export is not a feature supported by the Face API.
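Once the container is running, you call it the same way you would the cloud Face API, just against your local host. Below is a minimal Python sketch of building a detect request; the host and port (`localhost:5000`) and the example image URL are assumptions for illustration, while the `/face/v1.0/detect` route mirrors the cloud API surface the container exposes.

```python
import json
import urllib.request

# Assumed local endpoint: the containerized Face API listens on a port you
# choose when you run the image (5000 is used here for illustration).
CONTAINER_HOST = "http://localhost:5000"

def build_detect_request(image_url: str) -> urllib.request.Request:
    """Build a POST request against the local container's /detect route."""
    body = json.dumps({"url": image_url}).encode("utf-8")
    return urllib.request.Request(
        f"{CONTAINER_HOST}/face/v1.0/detect?returnFaceId=true",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical image URL; a real call would urlopen() this request and
# parse the JSON array of detected faces from the response.
req = build_detect_request("https://example.com/portrait.jpg")
print(req.full_url)
```

Because the REST surface matches the cloud service, existing client code usually only needs its endpoint swapped to the container's address.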

Related

How to get Azure Analysis Service Size across subscription level

We have 60+ Azure Analysis Services instances in our subscription, so how can we get the size of each one? We want to automate this and publish it in front-end reports where users can see the information.
It is difficult to get the size of each cube in each Azure Analysis Services instance by logging in to each one with SSMS.
Going with the Azure metrics memory option is not an accurate approach either.
I am following the blog below, but it did not let me run the script in PowerShell ISE; I got an error.
How we get analysis services database size in azure analysis services Tabular model
Is there any option to get the size of all Azure Analysis Services instances using a single script, or any REST API?
Thanks for your help.
Regards,
Brahma
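As a starting point for a single-script approach, here is a minimal Python sketch of the Azure Resource Manager call that enumerates every Analysis Services server in a subscription in one request. The subscription ID and bearer token are placeholders; note that ARM does not expose per-database size, so after enumerating servers you would still query each one (e.g. via DMVs such as DISCOVER_OBJECT_MEMORY_USAGE) for sizes.

```python
import urllib.request

# Placeholder values: substitute your real subscription ID and an Azure AD
# bearer token obtained for the https://management.azure.com resource.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
API_VERSION = "2017-08-01"

def list_servers_url(subscription_id: str) -> str:
    """ARM URL that lists all Analysis Services servers in a subscription."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        "/providers/Microsoft.AnalysisServices/servers"
        f"?api-version={API_VERSION}"
    )

def build_request(url: str, bearer_token: str) -> urllib.request.Request:
    # ARM returns server metadata (name, tier, state) but not cube sizes;
    # size per database still requires querying each server's DMVs.
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {bearer_token}"}
    )

url = list_servers_url(SUBSCRIPTION_ID)
print(url)
```

Iterating the returned server list and then hitting each server's DMV endpoint is one way to automate the report without opening SSMS sixty-plus times.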

How does Firebase and MongoDB Atlas Synchronise Time?

Does anyone know where the following services take their time reference from? In other words: with what source do they sync their time?
Firebase
MongoDB Atlas
Found out that AWS services sync their time with a service called Amazon Time Sync.
Amazon Time Sync is used by EC2 instances and other AWS services. It uses a fleet of redundant satellite-connected and atomic clocks in each Region to deliver time derived from these highly accurate reference clocks. This service is provided at no additional charge.
Likewise I need information about Firebase and MongoDB Atlas specifically. Any help/source is appreciated.
Here is what I found by myself.
AWS Services - AWS services sync time with Amazon Time Sync. It uses a fleet of redundant satellite-connected and atomic clocks in each Region to deliver time derived from these highly accurate reference clocks.
Google Services - Google services including Firebase use Google Public NTP. This is a free, global time service that you can use to synchronize to Google's atomic clocks.
MongoDB Atlas - MongoDB has moved to a global logical clock, implemented as a hybrid logical clock and protected by encryption.
Since all of these services use highly accurate time sources, we can assume they represent very nearly the exact time, so time stays synchronized across all the services in one application.
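Both Amazon Time Sync and Google Public NTP speak the standard NTP wire format, in which timestamps count seconds since 1900-01-01 rather than the Unix epoch of 1970-01-01. As a small illustration of what an NTP client does with a server's reply, here is the epoch conversion; the 2,208,988,800-second delta is the fixed offset between the two epochs.

```python
# NTP timestamps count seconds from 1900-01-01; Unix time counts from
# 1970-01-01. The delta between the two epochs is a fixed constant.
NTP_UNIX_DELTA = 2_208_988_800

def ntp_to_unix(ntp_seconds: int, ntp_fraction: int = 0) -> float:
    """Convert an NTP (seconds, 32-bit fraction) timestamp to Unix time."""
    return ntp_seconds - NTP_UNIX_DELTA + ntp_fraction / 2**32

# The NTP representation of the Unix epoch itself converts to 0.0:
print(ntp_to_unix(2_208_988_800))  # → 0.0
```

An NTP client applies this conversion to the server's transmit timestamp (from, say, time.google.com) and compares it with local receive times to estimate clock offset.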
References
https://aws.amazon.com/about-aw
https://developers.google.com/time/faq
https://www.mongodb.com/blog/post/transactions-background-part-4-the-global-logical-clock

LUIS URL for container

I am trying to use the LUIS container and am getting conflicting information. I have a test application that uses the LUIS cloud service via the Speech SDK. The Speech SDK sends the audio stream to the cloud and returns the LUIS intents, and detecting intents is in fact stated functionality of the Speech SDK. See the docs at this link.
https://learn.microsoft.com/en-us/azure/cognitive-services/speech-service/quickstarts/intent-recognition?pivots=programming-language-csharp
Currently, I am trying to move to the LUIS container using the same code base with the Speech SDK. However, when I try to connect to the local LUIS container (using SpeechConfig.FromEndpoint instead of SpeechConfig.FromSubscription), I get a connection error.
The conflicting information I am getting is whether the Speech SDK can support the LUIS container calls, or if I have to run the Speech-to-text container locally as well and broker the inputs and outputs of those containers in my code.
There is not much documentation on this scenario. Has anyone done this? Can you point me to any docs that describe this?
Sorry for the delayed response. The issue is that the Speech SDK (IntentRecognizer) does not work with the LUIS container; it only works with the LUIS cloud service. When using containers, you have to run both the LUIS and (in my case) Speech-to-text containers and use both of their SDKs. Actually, the LUIS prediction requests from the LUIS SDK to the container do not work either, so you have to build the HTTP requests manually against the REST API in the LUIS container. I have it basically working at this point, so this issue can be closed.
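Building the prediction request by hand is straightforward once you know the container's route. The sketch below assembles a v3 prediction URL in Python; the host/port and the app ID GUID are hypothetical placeholders, and the path assumes the container mirrors the cloud v3 prediction route.

```python
import urllib.parse

# Assumed container address; use whatever host/port you mapped when
# starting the LUIS container.
CONTAINER_HOST = "http://localhost:5000"

def build_prediction_url(app_id: str, query: str, slot: str = "production") -> str:
    """Build a GET URL for the LUIS v3 prediction REST route by hand."""
    return (
        f"{CONTAINER_HOST}/luis/prediction/v3.0/apps/{app_id}"
        f"/slots/{slot}/predict?{urllib.parse.urlencode({'query': query})}"
    )

# Hypothetical app ID; the text to predict is the transcript produced by
# the Speech-to-text container in this two-container setup.
url = build_prediction_url(
    "00000000-0000-0000-0000-000000000000", "turn on the lights"
)
print(url)
```

In the two-container pipeline described above, the Speech-to-text container produces the transcript and your code then issues this GET to the LUIS container and parses the intent JSON itself.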

Is Azure Cloud Service Worker role the only Azure hosting option for running an EventHub EventProcessor?

I'm currently fighting my way through Event Hubs and EventProcessorHost. All guidance I found so far suggests running an EventProcessor in an Azure Cloud Service worker role. Since those are very slow to deploy and update I was wondering if there is any Azure service that lets me run an EventProcessor in a more agile environment?
So far my rough architecture looks like this
Device > IoT Hub > Stream Analytics Job > Event Hub > [MyEventProcessor] > SignalR > Clients...
Or maybe there is another way of getting from Stream Analytics to firing SignalR messages?
Any recommendations are highly appreciated.
Thanks, Philipp
You may use the Azure Web App service with SignalR enabled and merge your pipeline steps [MyEventProcessor] and SignalR into one step.
I have done that a few times: I started from the simple SignalR chat demo and added the Event Hub receiver functionality to the SignalR processing. That article is close to what I mean in terms of approach.
You may take a look at Azure WebJobs as well. Basically, a WebJob can work as a background service running your logic, and the WebJobs SDK has support for Event Hubs.
You can run an EventProcessorHost in any Azure service that will run arbitrary C# code and keep running. The options for where you should run it depend on how much you want to spend and what you need. Azure Container Service may be the new fancy deployment option, but its minimum cost may not be suitable for you. I'm running my binaries that read data from Event Hubs on normal Azure Virtual Machines, with our deployment system in charge of managing them.
If your front-end processes that use SignalR to talk to clients stay around for a while, you could make each one its own logical consumer (consumer group) and have it consume the entire stream. Even if they don't stay around (i.e., you're using an Azure hosting option that turns the process off when idle), you could write your receiver to start at the end of the stream (as opposed to reprocessing older data), if that's what your scenario requires.
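The consumer-group idea above can be sketched with a toy in-memory model (this is not the Event Hubs SDK, just an illustration of the semantics): each group keeps its own independent cursor over the same stream, and a group that attaches "at the end" only ever sees events sent after it attached.

```python
# Toy model of Event Hub consumer-group semantics: one shared event log,
# one independent read cursor per consumer group.
class ToyStream:
    def __init__(self):
        self.events = []
        self.cursors = {}  # consumer group name -> next offset to read

    def send(self, event):
        self.events.append(event)

    def attach(self, group, from_end=False):
        # from_end=True models "start at end of stream": skip history.
        self.cursors[group] = len(self.events) if from_end else 0

    def receive(self, group):
        start = self.cursors[group]
        self.cursors[group] = len(self.events)
        return self.events[start:]

stream = ToyStream()
stream.send("e1")
stream.attach("replayer")                 # reads the whole stream
stream.attach("signalr", from_end=True)   # only future events
stream.send("e2")
print(stream.receive("replayer"))  # → ['e1', 'e2']
print(stream.receive("signalr"))   # → ['e2']
```

The real EventProcessorHost adds partitioning, leases, and checkpointing on top, but the per-group cursor is the core reason each SignalR front end can consume the entire stream without interfering with the others.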

Which Azure services are PaaS?

(note: To address a criticism I've deleted a prior question and reposted this more succinct version)
I'm trying to compare AWS and Azure for a custom web app that's essentially like any canned content management system. It requires web hosting, database, email, storage, security, some way to process ASP.NET but with high availability and load balanced.
The PaaS/IaaS distinction can sometimes be grey (in part because companies tend to use marketing jargon that portrays IaaS-type services as maintenance free). From a small-business perspective it's quite clear, though: if a service requires the SMB to spend time maintaining rather than developing, it's in the IaaS camp. Since I'm a single developer with limited time, a PaaS model for all services would be preferable. The ideal would be for all services (web hosting, database, email, etc.) to be offered as zero-maintenance, scalable services rather than having to spin up and manage individual instances.
I find AWS can do everything, but a drawback is that one still needs to manage instances (i.e., I would need to keep the software on instances updated, track instances, manage networking, security, etc.). S3 doesn't process scripts. AWS Elastic Beanstalk and OpsWorks are still essentially helper apps for starting up an IaaS-type environment (whereas, say, DynamoDB would count as a PaaS-type service). Recently Microsoft has dropped prices on Azure, which makes it an attractive alternative.
In short, I am looking for a list of services offered by Azure that are genuinely no-maintenance services, which don't require me to patch software or spin up instances to handle traffic spikes (e.g. web hosting, script processing, database, email, etc.).
Pretty much every Azure service besides "Virtual Machines" is PaaS, meaning it is fully managed by the platform and users only configure startup and runtime behavior upfront.
This includes, but is not limited to:
SQL Azure, Azure Storage, Media Services, Cloud Services, Websites, Cache, Service Bus, Identity, CDN, etc.
Azure was built with a PaaS-first mindset; IaaS only came later.
In addition, check out this page that explains the three flavors of compute you can get in Windows Azure (web site, cloud service, and virtual machine). It gives you a good understanding of their differences and which one to pick depending on the level of control you need.
Strictly speaking, the PaaS offerings in Windows Azure are the Web Role and Worker Role, because they provide out-of-the-box integration with the IDE and a framework for developing background jobs and web applications.