When to create multiple containers in Azure Cosmos DB - azure-cosmosdb

I am creating multiple microservice APIs. Can we use one single container to store all the different schemas from the different APIs? If so, how do we differentiate the schemas while retrieving documents? When should we create multiple containers in Azure Cosmos DB? Are there any disadvantages/caveats to using multiple containers?
Please explain.
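For illustration only, here is a minimal sketch of the single-container pattern the question asks about, using the azure-cosmos Python SDK. The account endpoint, key, container name, and the "docType" discriminator property are assumptions, not something prescribed by Cosmos DB; a discriminator property is simply one common way to tell schemas apart at query time.

```python
from azure.cosmos import CosmosClient, PartitionKey

# Placeholder account details; replace with your own.
client = CosmosClient("https://my-account.documents.azure.com:443/", credential="<primary-key>")
database = client.create_database_if_not_exists("microservices-db")

# One container shared by several microservice APIs.
container = database.create_container_if_not_exists(
    id="shared-items",
    partition_key=PartitionKey(path="/partitionKey"),
)

# Each API stamps its documents with a discriminator, e.g. "docType".
container.upsert_item({
    "id": "order-1",
    "partitionKey": "orders",
    "docType": "order",   # discriminator used to tell schemas apart
    "total": 42.50,
})

# Retrieve only one schema by filtering on the discriminator.
orders = container.query_items(
    query="SELECT * FROM c WHERE c.docType = @t",
    parameters=[{"name": "@t", "value": "order"}],
    enable_cross_partition_query=True,
)
for order in orders:
    print(order["id"], order["total"])
```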

Related

How to copy data from Cosmos DB API for MongoDB to another Cosmos DB account

How do I copy one collection from one Cosmos DB API for MongoDB account to another Cosmos DB API for MongoDB account, in another subscription, placed in another Azure region?
Preferably, the copy should run periodically.
You can use Azure Data Factory to easily copy a collection from one Cosmos DB API for MongoDB account to another Cosmos DB API for MongoDB account, in any other subscription and any other Azure region, using only the Azure Portal.
You need to deploy a few components, namely Linked Services, Datasets, and a Pipeline with a Copy data activity, to accomplish this task.
Use the Azure Cosmos DB (MongoDB API) Linked Service to connect Azure Data Factory with your Cosmos DB MongoDB API account. Refer to Create a linked service to Azure Cosmos DB's API for MongoDB using UI for more details and deployment steps.
Note: You need to deploy two Azure Cosmos DB (MongoDB API) Linked Services: one for the source account from which you copy the collection, and another for the destination account to which the data will be copied.
Create Datasets using the Linked Services created in the step above; a dataset connects you to a collection. Again, you need to deploy two datasets, one for the source collection and another for the destination collection, as sketched below.
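As a rough sketch of what the portal authors behind the scenes, the source-side Linked Service and Dataset definitions look approximately like the following, shown here as Python dictionaries. The names, connection string, database, and collection are placeholders, and the destination (sink) side mirrors the same shape.

```python
# Approximate shape of the source-side ADF definitions (placeholders throughout).
source_linked_service = {
    "name": "SourceCosmosMongoLinkedService",
    "properties": {
        "type": "CosmosDbMongoDbApi",
        "typeProperties": {
            "connectionString": "mongodb://<source-account>:<key>@<source-account>.mongo.cosmos.azure.com:10255/?ssl=true",
            "database": "<source-database>",
        },
    },
}

source_dataset = {
    "name": "SourceCollectionDataset",
    "properties": {
        "type": "CosmosDbMongoDbApiCollection",
        "linkedServiceName": {
            "referenceName": "SourceCosmosMongoLinkedService",
            "type": "LinkedServiceReference",
        },
        "typeProperties": {"collection": "<source-collection>"},
    },
}

# The destination Linked Service and Dataset have the same shape,
# pointing at the destination account, database, and collection.
```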
Now create a pipeline with a Copy data activity.
In the Source and Sink tabs of the Copy data activity settings, select the source dataset and the sink dataset, respectively, which you created in the previous step.
Now publish the changes and click the Debug option to run the pipeline once. The pipeline will run and the collection will be copied to the destination.
If you want to run the pipeline periodically, you can create a trigger based on an event or a specific schedule. Check Create a trigger that runs a pipeline on a schedule for more details.
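Continuing the same sketch, the pipeline with the Copy data activity and an optional schedule trigger would look roughly like this, again as Python dictionaries with placeholder names and times:

```python
# Approximate pipeline definition with a single Copy data activity.
copy_pipeline = {
    "name": "CopyMongoCollectionPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyCollection",
                "type": "Copy",
                "inputs": [{"referenceName": "SourceCollectionDataset", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SinkCollectionDataset", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "CosmosDbMongoDbApiSource"},
                    "sink": {"type": "CosmosDbMongoDbApiSink"},
                },
            }
        ]
    },
}

# Optional schedule trigger to run the copy periodically (here: once a day).
daily_trigger = {
    "name": "DailyCopyTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {"frequency": "Day", "interval": 1, "startTime": "2023-01-01T00:00:00Z"}
        },
        "pipelines": [
            {"pipelineReference": {"referenceName": "CopyMongoCollectionPipeline", "type": "PipelineReference"}}
        ],
    },
}
```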

How to reconfigure a Cosmos database to have shared container throughput

I have a database with several containers which have their own configured throughput. There is a new Cosmos DB feature which allows all containers within a database to share throughput. I can create a new database which has this feature enabled; however, I cannot seem to change my existing database to leverage this feature. Is there a way to enable this feature on an existing database, or do I have to create a new database and migrate all containers to it?
You have to create a new database. Changing the existing database is not supported:
A container with provisioned throughput cannot be converted to shared database container. Conversely a shared database container cannot be converted to have a dedicated throughput.
Set throughput on a database and a container
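A minimal sketch of creating the new database with shared throughput, using the azure-cosmos Python SDK (the endpoint, key, and names are placeholders). Containers created without their own throughput then share the database-level throughput, and data from the old database's containers has to be migrated into them.

```python
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("https://my-account.documents.azure.com:443/", credential="<primary-key>")

# Provision throughput at the database level so containers can share it.
shared_db = client.create_database("shared-throughput-db", offer_throughput=400)

# Containers created without their own offer_throughput share the database RU/s.
shared_db.create_container(id="container-a", partition_key=PartitionKey(path="/pk"))
shared_db.create_container(id="container-b", partition_key=PartitionKey(path="/pk"))
```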

Is a SQL Azure Table similar to an Azure Cosmos DB Container?

I am using the Cosmos DB SQL API, but I am confused about how to migrate my existing SQL DB into Azure Cosmos DB.
Is an SQL table similar to a Cosmos DB container? Or do we need to store all SQL table data in one container with different partition keys?
Do not be fooled by the name. The Cosmos DB SQL API does not allow you to work with Cosmos DB as though it were a relational database.
It is fundamentally a JSON document database for storing items in a container. It is schema-less. Whilst you can import data after a fashion (as @Bob linked), you don't end up with relational tables; it's all JSON documents.
The SQL API allows you to use a SQL-like syntax to query the JSON structure; the semantics, however, are all based on these hierarchically structured documents, allowing you to return arrays of the JSON documents or projections of them.
Queries always run in the context of a specific container.
You can't JOIN across documents, for example; you use JOINs to self-join within individual documents. There is basic aggregation across documents, and some limited grouping functionality.
The semantics are very different from SQL Azure; you need to think differently.
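To make those query semantics concrete, here is a small sketch using the azure-cosmos Python SDK against a hypothetical container of order documents (endpoint, key, names, and the "items" array are assumptions). The JOIN unrolls an array inside each document rather than joining two tables, and each result is a JSON projection rather than a relational row.

```python
from azure.cosmos import CosmosClient

client = CosmosClient("https://my-account.documents.azure.com:443/", credential="<primary-key>")
container = client.get_database_client("mydb").get_container_client("orders")

# JOIN here self-joins onto the "items" array *within* each order document;
# it does not join one container to another.
query = """
SELECT o.id, i.productName, i.quantity
FROM orders o
JOIN i IN o.items
WHERE i.quantity > @minQty
"""

results = container.query_items(
    query=query,
    parameters=[{"name": "@minQty", "value": 2}],
    enable_cross_partition_query=True,
)

for row in results:
    # Each result is a JSON projection, not a table row.
    print(row)
```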
The Cosmos DB Data Migration Tool can be used to import data from SQL Server into Cosmos DB. Refer to this link.
A Cosmos DB Container is not similar to a SQL Server Table. You can import data from multiple tables in a database into a single Cosmos DB container.

Creating a Graph Collection with an existing CosmosDB Account

I have an existing Cosmos DB account which was originally set up for the SQL API. I would like to create a graph but keep/manage it under the same database account.
I would like to use the Gremlin.Net SDK, which (if I understand properly) requires the 'https://my-account.gremlin.cosmosdb.azure.com:443/' endpoint. Do all collections have this endpoint, or only collections created with an account targeting the Gremlin API?
A better way of doing this would be to:
Create a new Graph (Gremlin API) account (this will create the required Gremlin server endpoint).
Migrate the existing data using the graph bulk executor.
Jayanta
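For illustration, a minimal sketch of connecting to the Gremlin endpoint of a Gremlin API account, using the gremlinpython driver rather than the Gremlin.Net SDK mentioned in the question. The account name, database, graph, key, and the partition key property name are placeholders; as the answer notes, the Gremlin endpoint is only created for an account targeting the Gremlin API.

```python
from gremlin_python.driver import client, serializer

# Placeholders: account, database, graph, and key must come from a Gremlin API account.
gremlin_client = client.Client(
    "wss://my-graph-account.gremlin.cosmosdb.azure.com:443/",
    "g",
    username="/dbs/my-database/colls/my-graph",
    password="<primary-key>",
    message_serializer=serializer.GraphSONSerializersV2d0(),
)

# Add a vertex and read it back ('pk' is an assumed partition key property name).
gremlin_client.submit("g.addV('person').property('id', 'p1').property('pk', 'p1')").all().result()
print(gremlin_client.submit("g.V('p1')").all().result())

gremlin_client.close()
```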

Using Azure Data Factory, how to extract data from arrays in documents of DocumentDB to SQL Database

I need to extract arrays from documents in a DocumentDB and copy them to a SQL Database using Azure Data Factory.
I need to implement the same functionality as using jsonNodeReference and jsonPathDefinition in "Sample 2: cross apply multiple objects with the same pattern from array" of this article:
https://learn.microsoft.com/en-us/azure/data-factory/data-factory-supported-file-and-compression-formats#json-format
According to the File and compression formats supported by Azure Data Factory article you mentioned, it seems that extracting data this way from DocumentDB to SQL Database with the Azure Data Factory Copy Activity is not currently supported. We could give our feedback to the Azure documentation team.
This topic applies to the following connectors: Amazon S3, Azure Blob, Azure Data Lake Store, File System, FTP, HDFS, HTTP, and SFTP.
But we could also use custom activities in an Azure Data Factory pipeline, or an Azure WebJob/Function with customized logic, to do that (see the sketch after the related documents below).
Some related documents:
How to query Azure Cosmos DB resources
How to operate Azure SQL Database
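As a hedged sketch of the "customized logic" route, for example inside an Azure Function or WebJob: read the documents with the azure-cosmos Python SDK, cross-apply the array client-side, and insert the flattened rows into SQL Database with pyodbc. All names, connection strings, and the array/column names ("orderlines", "prod", "price", dbo.OrderLines) are assumptions for illustration.

```python
import pyodbc
from azure.cosmos import CosmosClient

# Placeholder Cosmos DB (SQL API) connection details.
cosmos = CosmosClient("https://my-account.documents.azure.com:443/", credential="<primary-key>")
container = cosmos.get_database_client("mydb").get_container_client("orders")

# Placeholder Azure SQL Database connection string and target table.
sql_conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:my-server.database.windows.net,1433;"
    "Database=mydb;Uid=myuser;Pwd=<password>;Encrypt=yes;"
)
cursor = sql_conn.cursor()

# Cross-apply the "orderlines" array client-side: one SQL row per array element.
docs = container.query_items(
    query="SELECT c.id, c.orderlines FROM c",
    enable_cross_partition_query=True,
)
for doc in docs:
    for line in doc.get("orderlines", []):
        cursor.execute(
            "INSERT INTO dbo.OrderLines (OrderId, Product, Price) VALUES (?, ?, ?)",
            doc["id"], line.get("prod"), line.get("price"),
        )

sql_conn.commit()
sql_conn.close()
```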
