How many CPU/GPU cores does Azure Cognitive Services use during the chosen training time?

I can choose a training budget in customvision.ai in hours, and I can see how much it costs in the Azure pricing calculator. But I am missing information about what resources I get for this price per hour (how many CPU/GPU cores, etc.). I just want to understand whether that training uses only CPU or GPU too, and how much.

Custom Vision in Azure Cognitive Services is a platform service, as far as I understand. That means billing is based on service (API) usage, not on hardware resources (number of CPU/GPU cores, etc.). Azure's pricing documentation for this service supports this.
A sample training that I performed showed results like the below. CPU/GPU usage information for that training iteration is not available on customvision.ai or in the Azure portal.

It uses GPU for Advanced Training.
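For completeness, here is a hedged sketch of kicking off Advanced Training with an hourly budget from the Python SDK (azure-cognitiveservices-vision-customvision); the endpoint, key, and project id are placeholders. Note that you only pick the budget; the hardware is never exposed:

```python
# Sketch: start Advanced Training with a reserved budget in hours.
# Endpoint, training key, and project id below are placeholders.
from azure.cognitiveservices.vision.customvision.training import (
    CustomVisionTrainingClient,
)
from msrest.authentication import ApiKeyCredentials

credentials = ApiKeyCredentials(in_headers={"Training-key": "<training-key>"})
trainer = CustomVisionTrainingClient(
    "https://<resource-name>.cognitiveservices.azure.com/", credentials
)

# You specify the budget in hours; the service decides what hardware
# (CPU/GPU) backs the run and bills the reserved hours, not cores.
iteration = trainer.train_project(
    "<project-id>",
    training_type="Advanced",
    reserved_budget_in_hours=1,
)
print(iteration.status)
```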

What is a search unit in Azure Cognitive Search?

I need more details about how a search unit works internally. What is the advantage of having more than one search unit in Azure Cognitive Search?
Azure Cognitive Search allows you to add redundancy and partitioning to your service.
With redundancy (replicas), the data in your index exists in multiple copies; the main advantage of that is failure tolerance.
With partitioning, the data is split into shards; the main advantage of that is performance, as requests can be routed and handled with more parallelism.
Search units are the product of partitions and replicas: roughly, the number of virtual machines needed to provide the configuration you specified.
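As a small illustration of that relationship (the per-unit rate below is a made-up placeholder, not an actual Azure price):

```python
# Search units scale multiplicatively: SU = replicas x partitions.
def search_units(replicas: int, partitions: int) -> int:
    return replicas * partitions

# e.g. 3 replicas for failure tolerance, 4 partitions for parallelism:
su = search_units(replicas=3, partitions=4)  # 12 search units
price_per_su_hour = 0.34  # placeholder rate; check the Azure pricing page
print(f"{su} SUs, ~${su * price_per_su_hour:.2f}/hour")
```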

Azure Data Explorer vs Azure Synapse Analytics (a.k.a SQL DW)

I am designing a data management strategy for a big IoT company. Our use case is fairly typical: we ingest large quantities of data, analyze them, and produce datasets that customers can query for the insights they need.
I am looking at both Azure Data Explorer and the Data Warehouse side of Azure Synapse Analytics (a.k.a Azure SQL Data Warehouse) and find many commonalities. Yes, they use different languages and a different query engine on the backend, but both serve as a "serving layer" that customers use to query read-only data at a large scale.
I could not find any clear guidance from Microsoft about how to choose between the two, or maybe it makes sense to use them together? In that case, what is the best use case or type of data for each of the services?
If you can enlighten me, please share your thoughts here. If you know of any guidance on the matter, please reply with a link.
The classic, and also the modern, data warehouse pattern involves first designing a well-curated data model with documented entities and their attributes, then creating a scheduled ETL pipeline that transforms and aggregates the raw data, big and small, into that model, and finally loading and serving it. The curated data model provides stability, consistency, and reliability when these entities are consumed across an enterprise.
Azure Data Explorer was designed as an analytical data platform for telemetry. In this workload you do not aggregate the data first; you keep it close to its raw format, because you do not want to lose information. This lets you deal with the unexpected nature of security attacks, malfunctions, competitive behaviors, and the unknowns in general, since you can look at the fresh raw data from different angles, which provides a lot of flexibility.
This is why Azure Data Explorer is the storage for Microsoft Telemetry and also a growing set of analytical solutions like: Azure Monitor, Azure Security Center, Azure Sentinel, Azure Time Series Insights, IoT Central, PlayFab gaming analytics, Windows Intune Analytics, Customer Insights, Teams Education analytics and more.
It provides high-performance analytics on raw data, with schema-on-read capability over textual, semi-structured, and structured data.
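As a hedged illustration of that schema-on-read style, here is a minimal Python sketch using the azure-kusto-data package; the cluster URL, database, table, and payload fields are all made up:

```python
# Sketch: query raw JSON telemetry in ADX, parsing the payload at read time.
# pip install azure-kusto-data; cluster/database/table names are placeholders.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://<cluster>.<region>.kusto.windows.net"
)
client = KustoClient(kcsb)

# Schema-on-read: the JSON body is parsed at query time, not at ingestion,
# so looking at the raw data from a new angle needs no pipeline change.
query = """
RawTelemetry
| where Timestamp > ago(1h)
| extend Payload = parse_json(Body)
| summarize AvgTemp = avg(todouble(Payload.temperature))
    by DeviceId = tostring(Payload.deviceId)
"""
response = client.execute("iot-analytics", query)
for row in response.primary_results[0]:
    print(row["DeviceId"], row["AvgTemp"])
```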
Quite a few of our partners and customers are adopting ADX for the same reasons.
Check out the overview webinar that describes these concepts in detail.
Azure Synapse Analytics packaged SQL DW, ADF, and Spark so that all the components of the data warehouse pattern are highly integrated and easier to work with and manage. As we announced at the Azure Data Explorer Virtual Event, Azure Data Explorer is being integrated into Azure Synapse Analytics alongside the SQL and Spark pools to cater for telemetry workloads: real-time analytics on high-velocity, high-volume, high-variety data.
Check out some of the IoT cases (Buhler, Daimler video and story, Bosch, AGL); more leading IoT platforms are adopting Azure Data Explorer for this purpose. Reach out to us if you need additional help.

How to handle daily calculations in a web application (Google Firebase/NestJS)

I am building a web app with the following stack:
UI - React
Backend framework - NestJS
Infrastructure - Google Firestore document DB, services deployed on Heroku
I need to calculate financial portfolio metrics on a daily basis for all users and display them when the user logs in. I am in a bit of a dilemma about which approach to take, and I have several ideas, so I hope you can give me some guidance.
Scheduled microservice
I can build and schedule a microservice in Python (the finance framework is in Python) that will run every day, calculate the needed metrics for the users, and update the database. This seems straightforward, but it might consume a lot of compute resources, especially as the user base grows.
Cloud Functions
Google Firestore supports Cloud Functions that trigger on specific events. I can leverage that and run the calculation microservice when the data is requested, so that I calculate the information only on demand. The downside is that if the data has not been requested for a long time, I will have to calculate the metrics over a larger period, and that might take a while.
P.S. I just saw that there are also scheduled Cloud Functions. A possible implementation might check whether the data has been calculated today (i.e. the user has logged in at least once) and, if not, calculate it; see the sketch below.
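A minimal sketch of that scheduled check, assuming a Firestore collection named portfolios with a metrics_updated_on field; the collection, field names, and calculate_metrics() are hypothetical stand-ins for the real finance framework:

```python
# Sketch: recalculate portfolio metrics once per day, skipping fresh docs.
# pip install google-cloud-firestore; all names below are hypothetical.
from datetime import date

from google.cloud import firestore

db = firestore.Client()

def calculate_metrics(portfolio: dict) -> dict:
    """Stand-in for the real Python finance framework's calculation."""
    return {"daily_return": 0.0}

def refresh_stale_portfolios() -> None:
    today = date.today().isoformat()
    for doc in db.collection("portfolios").stream():
        data = doc.to_dict()
        if data.get("metrics_updated_on") == today:
            continue  # already calculated today, skip
        doc.reference.update({
            "metrics": calculate_metrics(data),
            "metrics_updated_on": today,
        })

if __name__ == "__main__":
    # Invoke from a daily scheduler: Heroku Scheduler, Cloud Scheduler,
    # or a scheduled Cloud Function.
    refresh_stale_portfolios()
```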
I will be happy to discuss any other options that might be available.

Request for higher concurrency for Speech-to-text

I am a developer at Across Cultures - we provide online EAL (English as an Additional Language) support for learners in schools.
I've been looking at your Speech Services API and have something working for our requirements; however, we will need support for more than 20 concurrent connections to the API, as we currently see as many as 100+ concurrent users.
Can you tell me whether it is possible to increase the concurrent connections, how that affects the price, and whether it can auto-scale or we need to specify the number in advance?
Thanks,
Simon
The FAQ page (https://learn.microsoft.com/en-us/azure/cognitive-services/speech-service/faq-stt) tells you what information we need in order to increase concurrency. Please don't post this information publicly (especially subscription keys, etc.).
There is no additional cost to increase concurrency: the concurrency setting defines an upper limit, and the service scales dynamically up to that limit.
thx
Wolfgang

Limit to number of requests to Microsoft Face API

Does anyone know whether the "Out of call volume quota" error applies exclusively to free trial users, and whether, if we subscribe to the monthly plan, there will be no limit on the number of calls to the Microsoft Face API?
I would also like to know: since the API can take 10 requests per second on a paid key, does that mean that sending requests from several processes simultaneously can shorten the total processing time?
Thank you
From the pricing page, the limit for the free trial is 30,000 API calls a month, and that limit is removed on the paid tier. The standard paid tier has a maximum throughput of 10 transactions per second; that traffic could come, for example, from multiple apps all submitting calls at the same time. If you need higher volume, please reach out to the team via the contact-us link at the bottom of the page on www.microsoft.com/cognitive.
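On the parallelism point, here is a minimal sketch of fanning requests out over several workers while keeping the aggregate rate under 10 TPS; the region, key, and image URLs are placeholders, and the throttle is a simple illustration rather than production retry/backoff logic:

```python
# Sketch: parallel Face API detect calls with a shared 10 TPS throttle.
# pip install requests; endpoint region and key are placeholders.
import threading
import time
from concurrent.futures import ThreadPoolExecutor

import requests

ENDPOINT = "https://<region>.api.cognitive.microsoft.com/face/v1.0/detect"
KEY = "<subscription-key>"

MIN_INTERVAL = 1.0 / 10  # stay under 10 transactions per second
_lock = threading.Lock()
_next_slot = [0.0]

def rate_limited_detect(image_url: str) -> list:
    # Global throttle: space calls at least 100 ms apart across all threads.
    with _lock:
        now = time.monotonic()
        wait = _next_slot[0] - now
        if wait > 0:
            time.sleep(wait)
        _next_slot[0] = max(now, _next_slot[0]) + MIN_INTERVAL
    resp = requests.post(
        ENDPOINT,
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": image_url},
    )
    resp.raise_for_status()
    return resp.json()

# Fanning out shortens wall-clock time, but throughput stays capped globally.
urls = ["https://example.com/face%d.jpg" % i for i in range(20)]
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(rate_limited_detect, urls))
print(len(results), "responses")
```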
After reading Ryan's answer I tried to determine from Microsoft how much it would cost to increase the transaction limit.
As of today, Microsoft Inside Sales confirmed that it is not possible.
We are stymied, and so we are considering other cloud services with a higher rate limit, or none at all, such as Kairos (https://www.kairos.com/pricing).
