Load data to Azure Cosmos DB (Mongo API)

I'm not able to write data to Azure Cosmos DB (Mongo API) using the PowerShell script below. I want to see the $output data in Cosmos DB (Mongo API).
$Params = @{
    "URI" = 'https://3ea5e53b-817e-4c41-ae0b-c5afc1610f4e-bluemix.cloudant.com/test/_all_docs?include_docs=true'
}
$output = Invoke-RestMethod @Params | ConvertTo-Json -Depth 11

It is not possible to write to Cosmos DB's MongoDB API with plain PowerShell REST calls; the MongoDB API speaks the MongoDB wire protocol, not HTTP. You need to use one of the MongoDB drivers to read from and write to Cosmos DB.
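As a minimal sketch of the missing piece: the Cloudant `_all_docs?include_docs=true` response wraps each document in a `rows[].doc` envelope, so it has to be reshaped before a MongoDB driver can insert it. The sample payload below is hypothetical, and the actual insert (shown only in a comment) assumes pymongo and a real Cosmos DB Mongo API connection string.

```python
# Sketch: reshape a Cloudant _all_docs?include_docs=true response into
# documents a MongoDB driver (e.g. pymongo against the Cosmos DB Mongo API)
# could insert. The sample payload below is hypothetical.
sample_response = {
    "total_rows": 2,
    "rows": [
        {"id": "a1", "doc": {"_id": "a1", "_rev": "1-x", "name": "first"}},
        {"id": "a2", "doc": {"_id": "a2", "_rev": "1-y", "name": "second"}},
    ],
}

def to_mongo_docs(all_docs_response):
    """Extract the embedded docs and drop Cloudant's _rev field."""
    docs = []
    for row in all_docs_response.get("rows", []):
        doc = dict(row["doc"])
        doc.pop("_rev", None)  # _rev is CouchDB/Cloudant-specific
        docs.append(doc)
    return docs

docs = to_mongo_docs(sample_response)

# With pymongo installed and a real account, the write step would be roughly:
#   from pymongo import MongoClient
#   MongoClient(cosmos_mongo_connection_string)["test"]["docs"].insert_many(docs)
```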


Does the EF Core Cosmos DB provider support the new Bulk API for Cosmos DB?

We are using the latest EF Core 3.1 and the Cosmos DB provider. We would like to have bulk inserts into Cosmos DB and do not know if the latest EF Core provider supports the new Bulk API for Cosmos DB.
We have looked at https://github.com/dotnet/efcore, but have not been able to determine that this is supported. We have also looked at https://learn.microsoft.com/en-us/ef/core/providers/cosmos/.
It doesn't include the bulk support that is found in the native Cosmos .NET SDK v3. You'll have to use the native .NET SDK v3 for bulk operations. We have a blog post from April that has more details.
Entity Framework does not provide this implicitly, but you can get the CosmosClient from the DbContext and do it explicitly.
Here's an example:
CosmosClient cosmosClient = _context.Database.GetCosmosClient();
Database database = cosmosClient.GetDatabase("PersonalizationOrder");
Container container = database.GetContainer(typeof(T).Name);

var data = await container.CreateItemAsync(new Order
{
    Id = Guid.NewGuid(),
    PartitionKey = "1",
    ShippingAddress = new StreetAddress()
    {
        City = "XYZ",
        Street = "Street, ttt st"
    }
});

"Sql api is not supported for this database account" error

I'm trying to execute a query against the Cosmos DB Mongo API, using the CData ODBC driver through Python. Below is the driver configuration:
[CData ODBC Driver for Cosmos DB]
Description=CData ODBC Driver for Cosmos DB 2019
Driver=/opt/cdata/cdata-odbc-driver-for-cosmosdb/lib/libcosmosdbodbc.x86.so
UsageCount=1
Driver64=/opt/cdata/cdata-odbc-driver-for-cosmosdb/lib/libcosmosdbodbc.x64.so
This is the code I'm using to make the query:
import pyodbc

cnxn = pyodbc.connect("DRIVER={CData ODBC Driver for Cosmos DB};AccountEndpoint=https://[myhost].com:443/;AccountKey=[mypass];")
cursor = cnxn.cursor()

# Find schemas
cursor.tables()
for (catalog, schema, table, table_type, description) in cursor:
    print("Catalog: {}, Schema: {}, Table: {}, Type: {}".format(
        catalog, schema, table, table_type
    ))

# Execute query
cursor.execute("SELECT luistest from luistest")
rows = cursor.fetchall()
for row in rows:
    print(row.luistest)
When I execute it, the table and schema query returns fine, but when I query the documents I receive the following error:
Catalog: CData, Schema: luis-test, Table: luistest, Type: TABLE
Traceback (most recent call last):
  File "mongo_odbc_test.py", line 11, in <module>
    cursor.execute("SELECT luistest from luistest")
pyodbc.Error: ('HY000', '[HY000] [Forbidden] Sql api is not supported for this database account\r\nActivityId: 66808c80-83b6-4694-99ac-295693b8f51d, Microsoft.Azure.Documents.Common/2.5.1. (-1) (SQLExecDirectW)')
I have a student Azure account; could this affect it? Is it possible to make a query without SQL using these tools?
Thanks.
This tool appears to use the SQL API to run SQL queries.
If your Cosmos DB account uses the Mongo API, you should use tools and drivers that speak the Mongo API.
If this tool is your main development/use case, I would argue that a Mongo API account might not be the correct choice, since you have no Mongo requirements; just create a SQL (Core) API account instead.
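For comparison, a rough sketch of what the equivalent read would look like through the Mongo API with pymongo instead of the SQL-based ODBC driver. The account name, key, database, and collection names are placeholders, and the actual network call is only shown in a comment; the connection string format follows Azure's documented Cosmos DB Mongo API pattern:

```python
# Hypothetical sketch: query the same collection through the Mongo API.
def cosmos_mongo_uri(account, key):
    """Build a Cosmos DB Mongo API connection string (port 10255, TLS),
    following the format Azure documents for Mongo API accounts."""
    return (
        f"mongodb://{account}:{key}@{account}.mongo.cosmos.azure.com:10255/"
        "?ssl=true&replicaSet=globaldb&retrywrites=false"
    )

uri = cosmos_mongo_uri("myaccount", "mypass")

# With pymongo installed and a real account, the query would be roughly:
#   from pymongo import MongoClient
#   client = MongoClient(uri)
#   for doc in client["luis-test"]["luistest"].find({}):
#       print(doc)
```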

Azure Cosmos DB - Gremlin API to clone existing collection into another collection

I have created a Gremlin API database in Azure Cosmos DB and have data in one collection.
However, I want to know if there is a way to clone the data into another collection in another database.
I want to copy graph data from the dev environment to the stage and prod environments.
You can use the existing tools for the Cosmos SQL API (earlier known as DocumentDB); Cosmos DB allows you to query a graph via the SQL API as well.
Something like select * from c can fetch the JSON representation of how Cosmos DB stores your graph data.
The simplest approach is the Cosmos DB Data Migration Tool:
Set the input source to Cosmos SQL API/DocumentDB, use your dev endpoint, and use the query select * from c.
Set the output type to JSON and export your data.
Now use the downloaded JSON as the input source, set your prod graph DB as the output (choose DocumentDB/Cosmos SQL API as the output type), and run it.
This should push your dev graph data to prod.
You can also use other Azure tools such as Data Factory, which work with DocumentDB.
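To illustrate what select * from c returns for a graph account, here is a rough sketch of the internal JSON shape Cosmos DB uses for graph data. Fields like _isEdge, _sink, and label do appear in exported graph documents, but the sample records below are hypothetical:

```python
# Hypothetical sample of a raw graph export. Vertices are plain documents
# with a "label"; edges additionally carry _isEdge, _vertexId (source
# vertex id) and _sink (target vertex id).
exported = [
    {"id": "v1", "label": "person", "name": [{"_value": "alice"}]},
    {"id": "v2", "label": "person", "name": [{"_value": "bob"}]},
    {"id": "e1", "label": "knows", "_isEdge": True,
     "_vertexId": "v1", "_sink": "v2"},
]

def split_graph(docs):
    """Separate vertices from edges in a raw Cosmos DB graph export."""
    edges = [d for d in docs if d.get("_isEdge")]
    vertices = [d for d in docs if not d.get("_isEdge")]
    return vertices, edges

vertices, edges = split_graph(exported)
```

Importing vertices before edges when pushing to the target account avoids dangling _sink references during the load.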
I just used CosmicClone to clone a Cosmos DB graph database from one account to another: https://github.com/microsoft/CosmicClone. It cloned 500k records in 20 minutes. It looks like it would also work within a DB to clone a collection.

How do you connect to a Cosmos DB (primarily updated via SQL API) using Gremlin.Net? (can you?)

I'm working on a Cosmos DB app that stores both standard documents and graph documents. We are saving both types via the DocumentDB API, and I am able to run graph queries that return GraphSON using the DocumentClient.CreateGremlinQuery method. This GraphSON is read by a web app, which displays the graph for user viewing, and so on.
My issue is that I cannot define the version of the GraphSON format returned when using the Microsoft.Azure.Graphs method. So I looked into Gremlin.Net, which has a lot more options in this regard, according to the documentation.
However, I am finding it difficult to connect to the Cosmos document DB using Gremlin.Net. The server variable, which you define like this:
var server = new GremlinServer("https://localhost/", 8081, enableSsl: true,
    username: $"/dbs/TheDatabase/colls/TheCOllection",
    password: "C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==");
then results in a URI that has "/gremlin" appended, and it cannot locate the database endpoint.
Has anyone used Gremlin.Net to connect to a Cosmos document database that has been set up as a document DB, not a graph DB? The documents in it are graph/Gremlin compatible in their format, with _isEdge / label / _sink etc.
Cheers,
Mark (Document db/Gremlin/graph newbie)

Cosmos MongoDB query fails but Azure Storage Explorer works fine?

I am trying to query a Cosmos MongoDB collection. I can connect to it fine with Robo3T, Studio 3T, and the dotnet core Mongo client (in a test harness). I can do a count of entities (db.[collection_name].count({})) on all of the platforms, but every query (db.[collection_name].find({})) fails with the following error:
Error: error: {
    "_t" : "OKMongoResponse",
    "ok" : 0,
    "code" : 1,
    "errmsg" : "Unknown server error occurred when processing this request.",
    "$err" : "Unknown server error occurred when processing this request."
}
Here is my sample query from Robo3T, and below that the sample .NET harness. It doesn't matter which I use; it's the same error every time.
db.wihistory.find({})
And the dotnet core code:
string connectionString = @"my connection string here";
MongoClientSettings settings = MongoClientSettings.FromUrl(
    new MongoUrl(connectionString)
);
settings.SslSettings =
    new SslSettings() { EnabledSslProtocols = SslProtocols.Tls12 };

var mongoClient = new MongoClient(settings);
var database = mongoClient.GetDatabase("vstsagileanalytics");
var collection = database.GetCollection<dynamic>("wihistory");
var data = collection.Find(new BsonDocument()).ToList();
System.Console.WriteLine(data.ToString());
The issue comes from mixing API usage in the account. As stated in the comments, you are using Azure Function's Cosmos DB Output binding, which uses the SQL API (.NET SDK for SQL API) to connect to the account and store data. There is a note in that documentation that says:
Don't use Azure Cosmos DB input or output bindings if you're using
MongoDB API on a Cosmos DB account. Data corruption is possible.
The documents stored through this method do not satisfy certain MongoDB requirements (like the existence of an "_id" identifier) that a MongoDB client would enforce (a MongoDB client automatically creates the "_id" if it is not present).
Robo3T and other Mongo clients (including the Azure Portal) fail to correctly parse and read the stored documents as valid MongoDB documents (due to the lack of requirements like "_id"), and that is the cause of the error.
You can either switch to use a Cosmos DB SQL API account if you want to maintain the Azure Functions pipeline or change the output binding and replace it with a manual implementation of a MongoDB client.
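To illustrate the "_id" point: a MongoDB driver adds _id automatically on insert, while documents written through the SQL API binding never get one. A rough, dependency-free Python sketch of repairing such documents (the field values are hypothetical, and a real fix would generate bson.ObjectId values via pymongo rather than UUID strings):

```python
import uuid

# Documents written through the SQL API output binding: valid JSON, but
# missing the "_id" a MongoDB client would have added on insert.
sql_api_docs = [
    {"id": "wi-1", "state": "Active"},
    {"id": "wi-2", "state": "Closed"},
]

def ensure_mongo_id(doc):
    """Return a copy of doc with an _id added if missing. A UUID string
    stands in for bson.ObjectId to keep this sketch dependency-free."""
    fixed = dict(doc)
    fixed.setdefault("_id", str(uuid.uuid4()))
    return fixed

repaired = [ensure_mongo_id(d) for d in sql_api_docs]
```

After such a repair pass (done with a MongoDB client, not the SQL API), Mongo tools should be able to read the documents again.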
