I am trying to set up the Azure Data Explorer plugin as a data source in Grafana, but we don't have a cluster set up for my Azure application, so I can't provide a cluster URL in that plugin. I tried the SimpleJson plugin, but it requires an HTTP URL and our Azure portal URL is HTTPS.
Are there any alternative plugins, or any other way, to get Azure Cosmos DB SQL API data into Grafana?
Thanks for your help!
Regards,
Sreenivasa
Sreeni, there is a Grafana use case for Azure, and for Azure Cosmos DB, but only through the Azure Monitor plugin, to view time-series metrics of your Azure services. Grafana cannot be used as a client to the Cosmos DB SQL API, if I understand your question correctly.
Please see: Deploying Grafana for production deployments on Azure
Related
I have a project where my end users use Alteryx to combine data sources and import the result into SQL Server. We are investigating replacing SQL Server with Cosmos DB, but I'm not sure whether connecting to Cosmos DB is supported by Alteryx.
Does anyone know if there is a connector? Am I overthinking this? I don't know Alteryx very well.
There is currently no dedicated Alteryx connector for Azure Cosmos DB.
However, the Azure Cosmos DB ODBC driver enables you to connect to Azure Cosmos DB using BI analytics tools.
Also, Cosmos DB has a REST API, so it should be possible to access it through Alteryx's Download tool. You can also post in the idea section of the Alteryx Community to request a dedicated connector.
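For a quick sanity check of the ODBC route outside Alteryx, here is a minimal Python sketch using pyodbc, assuming the Cosmos DB ODBC driver is installed and a DSN has been created for the account. The DSN name and container name below are placeholders:

import pyodbc

# "CosmosCoreSqlApi" is a hypothetical DSN created in the ODBC Data Source
# Administrator after installing the Azure Cosmos DB ODBC driver.
conn = pyodbc.connect("DSN=CosmosCoreSqlApi", autocommit=True)
cursor = conn.cursor()

# The driver projects each container as a relational table, so plain SQL
# works against it; "customers" is a placeholder container name.
cursor.execute("SELECT * FROM customers")
for row in cursor.fetchmany(10):
    print(row)

conn.close()

If that query returns rows, any ODBC-capable tool (Alteryx included) should be able to use the same DSN.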
I created an Azure Cosmos DB database and a container for my documents.
I use NiFi as my main data ingestion tool and want to feed that container with documents from NiFi flowfiles.
Can anybody please share a way to post flowfile content to Azure Cosmos DB from NiFi?
Thanks in advance
UPDATE (2019.05.26):
In the end I used a Python script, called from NiFi, to post the messages, passing each message as a parameter. I chose Python because the official Microsoft site has examples with all the required connection settings and libraries, so it was easy to connect to Cosmos DB (see the sketch below).
I tried the Mongo component, but couldn't connect to Azure (the security configuration didn't work); I didn't get very far with it since the Python script worked just fine.
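For reference, a minimal sketch of that kind of script using the azure-cosmos Python package. The endpoint, key, database, and container names are placeholders; NiFi can invoke it with the message as an argument, e.g. via the ExecuteStreamCommand or ExecuteProcess processor:

import json
import sys
import uuid

from azure.cosmos import CosmosClient

ENDPOINT = "https://<account>.documents.azure.com:443/"  # placeholder endpoint
KEY = "<primary-key>"  # placeholder key

def main():
    # The flowfile content arrives as the first command-line argument.
    doc = json.loads(sys.argv[1])
    doc.setdefault("id", str(uuid.uuid4()))  # Cosmos DB requires an "id" field

    client = CosmosClient(ENDPOINT, credential=KEY)
    container = client.get_database_client("mydb").get_container_client("mycontainer")
    container.upsert_item(doc)  # insert, or replace if the id already exists

if __name__ == "__main__":
    main()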
Azure Cosmos DB exposes a MongoDB API, so you can use the following MongoDB processors, available in NiFi, to read, query, and write documents to and from Azure Cosmos DB.
DeleteMongo
GetMongo
PutMongo
PutMongoRecord
RunMongoAggregation
Useful Links
https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb-introduction
https://learn.microsoft.com/en-us/azure/cosmos-db/mongodb-feature-support
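In these processors the Mongo URI property takes the Cosmos DB connection string, which you can copy from the Connection String blade of the account in the Azure portal. It requires TLS on port 10255, which is a common cause of failed security configuration. Here is a small pymongo sketch to verify the string before wiring it into NiFi; the account name, key, database, and collection are placeholders:

from pymongo import MongoClient

# Placeholder account and key; copy the real connection string from the
# Connection String blade of the Cosmos DB account in the Azure portal.
uri = ("mongodb://myaccount:<primary-key>@myaccount.documents.azure.com:10255/"
       "?ssl=true&replicaSet=globaldb")

client = MongoClient(uri)
db = client["mydb"]
print(db["mycollection"].count_documents({}))  # smoke-test the connection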
Valeria, according to the list of Azure-related components supported by Apache NiFi, only Azure Blob Storage, Queue Storage, Event Hubs, etc. are covered, not Cosmos DB.
So I suggest using PutAzureBlobStorage to feed an Azure Blob container with the documents from your NiFi flowfiles, and then creating a copy-activity pipeline in Azure Data Factory to transfer the data from Azure Blob Storage into Azure Cosmos DB.
I have an Azure Web App hosting an API (an ASP.NET MVC project) that interacts with a Cosmos DB database and its collections to get subscriptions and other information.
The Cosmos DB database is accessed read/write by the Web App's middleware through the NuGet package "Microsoft.Azure.DocumentDB" (SDK v1.19.1).
I am trying to set up the Cosmos DB IP firewall through the Azure portal. I allowed the Azure portal to have access to the database, and then I needed to also allow the web app (also hosted on Azure) to have access. To do this, I copied the virtual IP address of the Web App from the Properties tab in the Azure portal.
But this was not enough. I waited more than 10 minutes and kept trying my web app, but all calls to Cosmos DB were rejected with error 404, which, as the documentation states, is the expected behavior for SDK calls (for security reasons).
Then I added all the outbound IP addresses listed on the same Properties tab of the Web App. I waited for more than 20 minutes and still got the 404 error.
What are the correct steps to achieve the requested task?
For example, SQL on Azure's IP filtering has an option to allow access from any Azure app/VM/service. How can we achieve the equivalent in Cosmos DB?
Thanks in advance
Since Azure App Service is PaaS (so its outbound addresses are not guaranteed to stay fixed), and following this article, please try adding the IP 0.0.0.0 to the firewall allow list.
On the Azure Portal, this can also be set by switching on Allow access to Azure Services.
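If you prefer to script the change, a rough sketch with the azure-mgmt-cosmosdb Python package follows. Treat it as an assumption to check against your installed SDK version: the resource names are placeholders, and the exact model and method names (begin_update, ip_rules, IpAddressOrRange) may differ between package versions.

from azure.identity import DefaultAzureCredential
from azure.mgmt.cosmosdb import CosmosDBManagementClient
from azure.mgmt.cosmosdb.models import (
    DatabaseAccountUpdateParameters,
    IpAddressOrRange,
)

# Placeholder subscription, resource group, and account names.
client = CosmosDBManagementClient(DefaultAzureCredential(), "<subscription-id>")

# 0.0.0.0 is the special entry that admits traffic originating inside Azure
# datacenters; it is what the portal's "Allow access to Azure Services"
# toggle writes.
poller = client.database_accounts.begin_update(
    "<resource-group>",
    "<account-name>",
    DatabaseAccountUpdateParameters(
        ip_rules=[IpAddressOrRange(ip_address_or_range="0.0.0.0")]
    ),
)
poller.result()  # wait for the firewall update to complete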
I am trying to test-run a basic .NET web application on Pivotal Cloud Foundry. This web application uses a MongoDB server hosted on my local machine as its database. At the moment I am limited to using the cloud infrastructure through the Apps Manager alone.
I have read the Pivotal Cloud Foundry docs about user-provided services, but cannot figure out how the connection is actually made. I have also come across other approaches, such as MongoDB as a service (beta version), but at the moment I am not allowed access to the Operations Manager. I am looking for an explanation of user-provided services, or of how to implement the Service Broker API, specifically.
I am new to Mongo as well, so any suggestions on making the connection work by tweaking Mongo would help too. Thanks
The use case you describe (a web app in PCF connecting to a resource on your local machine) is not recommended: the platform's containers generally have no route back to your workstation.
You can create a MongoDB instance for development purposes in PCF.
$ cf marketplace
...
mlab sandbox Fully managed MongoDB-as-a-Service
...
You can create a mlab service and bind it to your application. You will then have a MongoDB instance in PCF that you can use for development purposes.
Edit:
In that case a user-provided service might help you: you pass in your remote MongoDB instance's configuration, which your application can then read. For example:
cf.exe cups my-mongodb -p '{"key1":"value1","key2":"value2"}'
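Once the service is bound (cf bind-service), the app reads those credentials back from the VCAP_SERVICES environment variable. Your app is ASP.NET, but as an illustration of the JSON shape involved, here is a Python sketch; the service name my-mongodb and the keys match the cups command above:

import json
import os

# Bound user-provided services appear under the "user-provided" label.
vcap = json.loads(os.environ["VCAP_SERVICES"])
mongo = next(s for s in vcap["user-provided"] if s["name"] == "my-mongodb")

creds = mongo["credentials"]
print(creds["key1"], creds["key2"])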
You can add your local MongoDB as a CUPS service to your PCF Dev instance.
Check out the following post.
How to create a CUPS service for mongoDB?
Is there any issue getting a standard WordPress site, hosted on Google Compute Engine, to connect to a Google Cloud SQL database?
You need to authenticate from Compute Engine to Cloud SQL. You can set the service account scopes on the Compute Engine instance at the time of creation; for Cloud SQL the scope is https://www.googleapis.com/auth/sqlservice.admin. For more details about service account scopes for other Google Cloud services, see https://developers.google.com/compute/docs/authentication.