Is a SQL Azure Table similar to an Azure Cosmos DB Container? - azure-cosmosdb

I am using the Cosmos DB SQL API, but I am confused about how to migrate my existing SQL database into Azure Cosmos DB.
Is a SQL table similar to a Cosmos DB container, or do we need to store all of the SQL table data in one container with different partition keys?

Do not be fooled by the name. The Cosmos DB SQL API does not allow you to work with Cosmos DB as though it were a relational database.
It is fundamentally a JSON document database that stores items in containers, and it is schema-less. While you can import data after a fashion (as @Bob linked), you don't end up with relational tables - it's all JSON documents.
The SQL API lets you use a SQL-like syntax to query the JSON structure; the semantics, however, are all based on these hierarchically structured documents, allowing you to return arrays of the JSON documents or projections of them.
Queries always run in the context of a specific container.
You can't JOIN across documents, for example - JOINs are used to self-join within an individual document. There is basic aggregation across documents and some limited grouping functionality.
The semantics are very different from SQL Azure; you need to think differently.
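To make that concrete, here is a minimal sketch (assuming a hypothetical "orders" container whose documents contain an "items" array) of an intra-document JOIN with the Microsoft.Azure.Cosmos SDK; the account, database, and property names are placeholders:
using System;
using Microsoft.Azure.Cosmos;
// Sketch only: the JOIN below unrolls the "items" array inside each document.
// It never joins one document to another document.
CosmosClient client = new CosmosClient("https://<account>.documents.azure.com:443/", "<account-key>");
Container container = client.GetContainer("mydb", "orders");
QueryDefinition query = new QueryDefinition(
    "SELECT o.id, i.productName, i.quantity FROM orders o JOIN i IN o.items WHERE i.quantity > @minQty")
    .WithParameter("@minQty", 5);
using FeedIterator<dynamic> iterator = container.GetItemQueryIterator<dynamic>(query);
while (iterator.HasMoreResults)
{
    foreach (var item in await iterator.ReadNextAsync())
    {
        Console.WriteLine(item);
    }
}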

The Cosmos DB Data Migration tool can be used to import data from SQL Server into Cosmos DB; refer to this link.
A Cosmos DB container is not equivalent to a SQL Server table. You can import data from multiple tables in a database into a single Cosmos DB container.

Related

Manage Partition Key in Azure Cosmos DB with C# SDK

Is there any way to manage an Azure Cosmos DB collection with a partition key, using the Mongo API, directly from C#?
Currently we use Terraform to provision the Cosmos DB account and the database, and we use MongoDB.Driver to administer the collections. To get the collection (and create it if it does not exist), we use the following syntax:
public IMongoCollection<MyDocument> MyDocumentsCollection =>
    Database.GetCollection<MyDocument>("MyDocuments", mongoCollectionSetting);
I do not see any option to set the partition key for the collection, and I was expecting to accomplish this with the mongoCollectionSettings. What is the best option to get this working?
I have found the Microsoft.Azure.Cosmos SDK, but that is only applicable to Cosmos DB's SQL API.
I also don't want to resort to the RunCommand method, but I guess this is the only option...?! Is it?
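For what it's worth, the RunCommand route would be a minimal sketch along these lines, assuming the Azure Cosmos DB for MongoDB extension commands (customAction "CreateCollection" with a shardKey) are available on the account; the collection and partition key names below are placeholders:
using MongoDB.Bson;
using MongoDB.Driver;
// Sketch only: create a partitioned (sharded) collection via the Cosmos DB MongoDB extension command.
// "Database" is assumed to be the IMongoDatabase already used in the snippet above.
var createCommand = new BsonDocument
{
    { "customAction", "CreateCollection" },
    { "collection", "MyDocuments" },
    { "shardKey", "myPartitionKey" }
};
Database.RunCommand<BsonDocument>(createCommand);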

I have a requirement to load data from Azure Cosmos DB (JSON format) into Snowflake. What are my options?

I tried using a Node.js function triggered by the Cosmos DB change feed to insert directly into Snowflake table columns, but that seems to be slow. Please suggest any other options.
You can use ADF, which has both Snowflake and Azure Cosmos DB connectors. Azure Cosmos DB will be your source, and Snowflake can be used as the sink.
Create a linked service to Azure Cosmos DB using UI
Snowflake as sink
The good part is that the Azure Cosmos DB change feed is now supported in ADF, so this can also be integrated through ADF.
Native change data capture (CDC) - CosmosDB change feed is supported in ADF now

Cosmos DB hybrid data APIs access (MongoDB & Gremlin)

Is it possible to add data with the MongoDB APIs, then add edges between the documents with Gremlin, and query the same data with both APIs?
This is not possible today because the MongoDB API stores data as BSON, which is not standard JSON. It is, however, technically possible to use the SQL and Gremlin APIs together within Cosmos DB.

Deleting documents from Cosmos DB collection during incremental load using Azure Data Factory

I am trying to copy data from Azure SQL to an Azure Cosmos DB collection (Mongo API). I am using upsert to insert/update the documents in Cosmos DB. However, if any data from the source is deleted, how do I delete that document from Cosmos DB?
As far as I am aware, it is impossible to delete documents from a Cosmos DB collection during an incremental load.
This is because an incremental load compares LastModifytime; if you delete rows in Azure SQL, they no longer exist in the source, and Copy Data only supports insert and update.
If you want to keep your data synchronized, you need to delete those documents in Cosmos DB yourself.
You can run the delete directly in Cosmos DB, or add a DeleteStatus column. When you want to delete data, update DeleteStatus and LastModifytime, then run the incremental load. Finally, run the following SQL in both Cosmos DB and Azure SQL:
delete from xxxx where DeleteStatus = 1
Hope this helps.
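If you would rather run the Cosmos DB side of that cleanup from code instead of a query, a minimal sketch with MongoDB.Driver might look like this (the connection string, database, collection name, and the DeleteStatus field are assumptions taken from the soft-delete approach above):
using System;
using MongoDB.Bson;
using MongoDB.Driver;
// Sketch only: remove documents previously flagged with DeleteStatus = 1.
var client = new MongoClient("<cosmos-db-mongo-connection-string>");
var database = client.GetDatabase("mydb");
var collection = database.GetCollection<BsonDocument>("MyDocuments");
var filter = Builders<BsonDocument>.Filter.Eq("DeleteStatus", 1);
DeleteResult result = collection.DeleteMany(filter);
Console.WriteLine($"Deleted {result.DeletedCount} documents.");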

Using Azure Data Factory, how to extract data from arrays in documents of DocumentDB to SQL Database

I need to extract arrays from documents in a DocumentDB collection and copy them to a SQL Database using Azure Data Factory.
I need to implement the same functionality of using jsonNodeReference and jsonPathDefinition in "Sample 2: cross apply multiple objects with the same pattern from array" of this article:
https://learn.microsoft.com/en-us/azure/data-factory/data-factory-supported-file-and-compression-formats#json-format
According to the File and compression formats supported by Azure Data Factory article you mention, extracting data this way from DocumentDB to SQL Database with the Azure Data Factory Copy Activity does not currently seem to be supported. We could give our feedback to the Azure documentation team:
This topic applies to the following connectors: Amazon S3, Azure Blob, Azure Data Lake Store, File System, FTP, HDFS, HTTP, and SFTP.
However, we could also use custom activities in an Azure Data Factory pipeline, or an Azure WebJob/Function with customized logic, to do that (see the sketch after the links below).
Some related documents:
How to Query Azure Cosmos DB resources
How to operate Azure SQL Database
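As a rough illustration of that custom-logic route, here is a minimal sketch (not a ready-made solution) that reads documents from a hypothetical Cosmos DB container, cross-applies an "orders" array, and inserts one row per array element into a SQL Database table; every name (account, container, table, columns) is a placeholder:
using System;
using Microsoft.Azure.Cosmos;
using Microsoft.Data.SqlClient;
// Sketch only: flatten an array property of each document into rows of a SQL table.
CosmosClient cosmos = new CosmosClient("https://<account>.documents.azure.com:443/", "<account-key>");
Container container = cosmos.GetContainer("mydb", "customers");
using SqlConnection sql = new SqlConnection("<sql-database-connection-string>");
sql.Open();
// One result per array element, similar to the "cross apply" sample in the linked article.
QueryDefinition query = new QueryDefinition(
    "SELECT c.id AS CustomerId, o.orderId AS OrderId, o.amount AS Amount FROM c JOIN o IN c.orders");
using FeedIterator<OrderRow> iterator = container.GetItemQueryIterator<OrderRow>(query);
while (iterator.HasMoreResults)
{
    foreach (OrderRow row in await iterator.ReadNextAsync())
    {
        using SqlCommand insert = new SqlCommand(
            "INSERT INTO CustomerOrders (CustomerId, OrderId, Amount) VALUES (@c, @o, @a)", sql);
        insert.Parameters.AddWithValue("@c", row.CustomerId);
        insert.Parameters.AddWithValue("@o", row.OrderId);
        insert.Parameters.AddWithValue("@a", row.Amount);
        insert.ExecuteNonQuery();
    }
}
// Shape of one flattened query result.
public class OrderRow
{
    public string CustomerId { get; set; }
    public string OrderId { get; set; }
    public decimal Amount { get; set; }
}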
