I am going to migrate our service to use the Azure.Search.Documents (v11) SDK instead of raw HttpClient GET/POST requests to query documents from Azure Cognitive Search.
Per the SDK documentation, we need to initialize a SearchClient with a service endpoint and an index name. Our service is multi-tenant and hosts multiple customers' indexes in the same search service (potentially 3,000 indexes on an S3 HD service), so in theory we would need up to 3,000 SearchClient instances.
My question is: is it worth implementing a client pool that reuses the SearchClient for a given index, or can I just create a new SearchClient for each request to Azure Search? I'm not sure whether the SDK pools clients internally.
Please take a look at the best practices guide for working with client objects in the Azure SDK. Reusing client instances wherever possible will result in better-performing applications.
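A minimal sketch of what a per-index client cache might look like with the v11 SDK, assuming a single shared endpoint and key credential (the class and member names here are illustrative, not from the SDK):

```csharp
// Sketch: cache one SearchClient per index name. SearchClient is thread-safe,
// so a single instance per index can be shared across concurrent requests.
using System;
using System.Collections.Concurrent;
using Azure;
using Azure.Search.Documents;

public class SearchClientFactory
{
    private readonly Uri _endpoint;
    private readonly AzureKeyCredential _credential;
    private readonly ConcurrentDictionary<string, SearchClient> _clients = new();

    public SearchClientFactory(Uri endpoint, AzureKeyCredential credential)
    {
        _endpoint = endpoint;
        _credential = credential;
    }

    public SearchClient GetClient(string indexName) =>
        _clients.GetOrAdd(indexName,
            name => new SearchClient(_endpoint, name, _credential));
}
```

Azure SDK clients are thread-safe and intended to be reused, so even with thousands of indexes this cache stays cheap: you avoid rebuilding the client and its HTTP pipeline on every request.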
We have several App Service APIs that will read from the same Cosmos DB. We want each API to use a different connection string for security reasons, but in the Azure portal it seems we can only have two master connection strings and two read-only connection strings.
Can I generate more read-only connection strings?
That is not directly supported, but there are other ways to achieve what you're looking for.
Essentially, you are now using the 'Master Keys' approach to define your connection string: you build it by specifying the account and one of the two master keys that grant access to your data.
The only other approach available is the 'Resource Token' approach. In this architecture you have a middleware tier that is configured with a secure connection to your Cosmos account using one of the master keys, and all the applications (that need to be individually secured) call into this layer for an access token.
With this approach you will have to manage authentication of your clients to your middleware, so it is definitely more involved, but ultimately more secure, especially if your APIs are deployed outside of your environment.
See the official docs page for all the details about Resource Tokens.
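As a hedged sketch of what that token-issuing middleware tier might look like with the v3 .NET SDK (the database, container, user, and permission IDs are assumptions for illustration):

```csharp
// Sketch: a broker that holds the master-key connection and hands out
// scoped, read-only resource tokens to individual applications.
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public class ResourceTokenBroker
{
    private readonly Database _database;

    public ResourceTokenBroker(CosmosClient clientWithMasterKey, string databaseId)
        => _database = clientWithMasterKey.GetDatabase(databaseId);

    // Returns a read-only token scoped to a single container for one caller.
    public async Task<string> GetReadOnlyTokenAsync(string userId, string containerId)
    {
        User user = (await _database.UpsertUserAsync(userId)).User;
        var permission = new PermissionProperties(
            id: $"read-{containerId}",            // permission id is an assumption
            permissionMode: PermissionMode.Read,
            container: _database.GetContainer(containerId));
        // Token lifetime of 1 hour is an assumption; tune to your needs.
        PermissionResponse response =
            await user.UpsertPermissionAsync(permission, tokenExpiryInSeconds: 3600);
        return response.Resource.Token;
    }
}
```

Each application then builds its own client from the returned token instead of a master key, e.g. `new CosmosClient(accountEndpoint, resourceToken)`, and can only read the container it was granted.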
API keys in a client-side web context are never secure, and this can be a big problem if the API services are paid. How do you solve this problem in the Azure Cognitive Services context?
I assume you are referring to not exposing the keys to the end user who is browsing to your website, as opposed to securing the keys on the web server itself from another admin on the server.
This isn't really a Cognitive Services question, but a question generic to any secrets you want to keep when hosting a website (or creating a mobile app, or really any app that uses some sort of key or password).
The short answer is: don't give the key to the client. That means the client can't make the call to Cognitive Services directly, so you need code running on your web server that makes the call.
Generally you would do one of two things:
1. Code running on your web server makes the call to Cognitive Services, then processes and displays the relevant results to the user via the webpage.
2. Your web server exposes its own API, and client-side script calls that API. Your API internally calls the Cognitive Services API and returns the data, which the client-side script then processes and displays.
You can also find similar info at How to protect an API Key when using JavaScript?, or a web search for something like 'web development protecting api keys javascript'.
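For illustration, a minimal sketch of the second option using an ASP.NET Core minimal API as the server-side proxy; the route, configuration keys, and Cognitive Services endpoint are assumptions, not a prescribed layout:

```csharp
// Sketch (.NET 6+ minimal API): the browser calls /api/analyze; the
// subscription key stays in server configuration and is never sent down.
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddHttpClient();
var app = builder.Build();

app.MapPost("/api/analyze", async (IHttpClientFactory factory, HttpRequest request) =>
{
    HttpClient client = factory.CreateClient();
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key",
        app.Configuration["CognitiveServices:Key"]);

    // Forward the caller's JSON body to the Cognitive Services endpoint.
    using var content = new StreamContent(request.Body);
    content.Headers.ContentType =
        new System.Net.Http.Headers.MediaTypeHeaderValue("application/json");
    HttpResponseMessage response = await client.PostAsync(
        app.Configuration["CognitiveServices:Endpoint"], content);

    return Results.Content(
        await response.Content.ReadAsStringAsync(), "application/json");
});

app.Run();
```

Client-side script only ever sees your own API; adding authentication and rate limiting to that endpoint is then up to you.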
I'm designing a database monitoring application. Basically, the database will be hosted in the cloud, and record-level access to it will be provided via custom-written clients for Windows, iOS, Android, etc. The basic scenario can be implemented via web services (ASP.NET WebAPI). For example, the client will make a GET request to the web service to fetch an entry. However, one of the requirements is that the client should automatically refresh its UI if another user (using a different instance of the client) updates the same record, AND the auto-refresh needs to happen within a second of the record being updated, so that the info is always up to date.
Polling could be an option, but the active clients could number in the hundreds of thousands, so I'm looking for a more robust solution that is lightweight on the server. I'm versed in .NET and C++/Windows, and I could roll out a complete solution in C++/Windows using I/O completion ports, but that feels like overkill and would require too much development time. I looked into ASP.NET WebAPI, but its limitation is that it cannot push notifications out to clients. Are there any frameworks/technologies in the Windows ecosystem that can address this scenario and scale easily as well? Any good options outside the Windows ecosystem, e.g. Node.js?
You did not specify which database you can use, so if you are able to use MS SQL Server, you may want to look up the SqlDependency feature. If configured and used correctly, you will be notified of any changes in the database.
Pair this with SignalR or any real-time front-end framework of your choice and you'll have real-time updates as you described.
One catch, though, is that SqlDependency only tells you that something changed; you are responsible for tracking down which record it was. That adds an extra layer of difficulty, but it is much better than polling.
You may want to search through the sqldependency tag here at SO to go from here to where you want your app to be.
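A hedged sketch of how SqlDependency and SignalR might be wired together (the table, column, and hub names are assumptions, and Service Broker must be enabled on the database):

```csharp
// Sketch: SqlDependency fires when the watched query's results change,
// and a SignalR hub broadcasts the event to connected clients.
// System.Data.SqlClient is used here; Microsoft.Data.SqlClient is analogous.
using System.Data.SqlClient;
using Microsoft.AspNetCore.SignalR;

public class RecordsHub : Hub { }

public class RecordWatcher
{
    private readonly string _connectionString;
    private readonly IHubContext<RecordsHub> _hub;

    public RecordWatcher(string connectionString, IHubContext<RecordsHub> hub)
    {
        _connectionString = connectionString;
        _hub = hub;
        SqlDependency.Start(_connectionString); // once per application lifetime
        Subscribe();
    }

    private void Subscribe()
    {
        using var connection = new SqlConnection(_connectionString);
        // Query notifications require an explicit column list (no SELECT *).
        using var command = new SqlCommand("SELECT Id, Value FROM dbo.Records", connection);
        var dependency = new SqlDependency(command);
        dependency.OnChange += async (_, e) =>
        {
            Subscribe(); // subscriptions fire only once, so re-subscribe first
            // The event only says *that* something changed; finding *which*
            // record changed (re-query or diff) is up to you, as noted above.
            await _hub.Clients.All.SendAsync("recordsChanged", e.Info.ToString());
        };
        connection.Open();
        command.ExecuteReader().Dispose(); // executing registers the subscription
    }
}
```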
My first thought was to have a web service call that "stays alive", or to use the HTML5 protocol called WebSockets. You can maintain lots of connections that way, but hundreds of thousands seems too large, so the web service needs a way to contact the clients over stateless connections. One option is to build a web service into the client that the server can communicate with, although firewalls may get in the way.
If firewalls are not an issue, then you may not need a web service in the client; you can instead implement a listening server socket on the client (see the sketch after this answer).
For mobile clients, if implementing a server socket is not a possibility then use push notifications. Perhaps look at https://stackoverflow.com/a/6676586/4350148 for a similar issue.
Finally you may want to consider a content delivery network.
One last point: hopefully you don't need to contact all 100,000 users within one second. I am assuming that with so many users you have quite a few servers.
Take a look at Maximum concurrent Socket.IO connections regarding the maximum number of open WebSocket connections. Also consider whether your estimate of on the order of 100,000 simultaneous users is accurate.
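As a rough illustration of the "server socket on the client" idea (the port number and the newline-delimited message format are assumptions):

```csharp
// Sketch: a desktop client listens on a TCP port so the server can push
// change notifications without the client polling.
using System.Net;
using System.Net.Sockets;
using System.Text;

var listener = new TcpListener(IPAddress.Any, 9000); // port is an assumption
listener.Start();
while (true)
{
    using TcpClient server = await listener.AcceptTcpClientAsync();
    using var reader = new StreamReader(server.GetStream(), Encoding.UTF8);
    // One notification per line; on receipt, refresh the affected record in the UI.
    string? line;
    while ((line = await reader.ReadLineAsync()) != null)
        Console.WriteLine($"Server pushed: {line}");
}
```

As noted above, this only works when firewalls and NAT allow inbound connections to the client, which is why push notification services are the usual fallback on mobile.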
There is an application that should read the tweets of every registered user, process them, and store the data for future use.
It can reach Twitter in two ways: via the REST API (polling Twitter every x minutes) or via the Streaming API, which delivers tweets as they happen.
Besides the completely different server-side implementations, I wonder what the other server-side impacts are.
Say the application has thousands of users. Is it better to build some kind of queue and poll Twitter for each user (the simplest scenario), or to use the Streaming API and keep an HTTP connection open for each user? I'm a bit worried about the latter, as it would require keeping thousands of connections open all the time. Are there any drawbacks to that I'm not aware of? If I wanted to deploy my app on Heroku or on an EC2 instance, would that be OK, or are there limits?
How is this done in other apps that constantly need to fetch data for each user?
Is it efficient to use a web service to access database objects?
I'm developing a Windows Phone app and a web app. Both will use the same database. Should I create one web service for both apps?
A shared webservice is definitely the right way to go. That's really the point of a service, to be able to access the same business and data logic from multiple places (assuming both places are doing the same thing of course). It also acts as a natural security buffer between your app and database - so your database only needs to accept connections from the service, as opposed to multiple client applications.
As far as the technology, since both of your clients are Microsoft, you can use WCF as your service as opposed to a traditional SOAP service. Or you can go with something more universally accepted, like WebAPI with JSON. Lots of options there.
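For illustration, a minimal sketch of the WebAPI-with-JSON option using ASP.NET Core (the entity, repository, and route names are assumptions; classic Web API 2 looks very similar):

```csharp
// Sketch: one controller both the phone app and the web app call, backed by
// a shared data-access layer, so the database only talks to the service.
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public record Item(int Id, string Name);

public interface IItemRepository
{
    Task<Item?> FindAsync(int id);
}

[ApiController]
[Route("api/[controller]")]
public class ItemsController : ControllerBase
{
    private readonly IItemRepository _repository; // shared business/data logic

    public ItemsController(IItemRepository repository) => _repository = repository;

    // GET api/items/5 -- both clients consume the same JSON contract.
    [HttpGet("{id}")]
    public async Task<ActionResult<Item>> Get(int id)
    {
        Item? item = await _repository.FindAsync(id);
        return item is null ? NotFound() : Ok(item);
    }
}
```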