Following: https://learn.microsoft.com/en-us/azure/cosmos-db/sql/create-sql-api-spark
But I am getting an error while creating a database on Cosmos DB:
# Configure the Catalog API to be used
spark.conf.set("spark.sql.catalog.cosmosCatalog", "com.azure.cosmos.spark.CosmosCatalog")
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountEndpoint", cosmosEndpoint)
spark.conf.set("spark.sql.catalog.cosmosCatalog.spark.cosmos.accountKey", cosmosMasterKey)
# Create a Cosmos database using the Catalog API
spark.sql("CREATE DATABASE IF NOT EXISTS cosmosCatalog.{};".format(cosmosDatabaseName))
Error:
java.lang.RuntimeException: Client initialization failed. Check if the endpoint is reachable and if your auth token is valid. More info: https://aka.ms/cosmosdb-tsg-service-unavailable-java
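This initialization error usually means the endpoint or key the catalog was configured with is malformed or truncated. A quick local sanity check of the two values can rule out copy/paste mistakes before involving Spark; this helper and its checks are illustrative assumptions, not part of the Cosmos SDK:

```python
import base64
import re

def check_cosmos_config(endpoint: str, key: str) -> list:
    """Return a list of problems found with a Cosmos endpoint/key pair."""
    problems = []
    # SQL API endpoints normally look like https://<account>.documents.azure.com:443/
    if not re.match(r"^https://[a-z0-9\-]+\.documents\.azure\.com(:443)?/?$", endpoint):
        problems.append("endpoint does not look like a Cosmos SQL API endpoint")
    # Account keys are base64-encoded; a truncated or whitespace-padded key won't decode
    try:
        base64.b64decode(key, validate=True)
    except Exception:
        problems.append("account key is not valid base64")
    return problems

print(check_cosmos_config("https://myaccount.documents.azure.com:443/", "bm90LWEtcmVhbC1rZXk="))  # → []
```

If both checks pass, the next things to verify are network reachability of the endpoint from the Spark cluster and whether the key has been rotated.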
Background
I am trying to connect a database to my Next.js project. I decided to use https://www.mockaroo.com/ to generate mock data, then used the PlanetScale browser console to create the table and insert the values with the SQL generated by Mockaroo.
I now have a working table in PlanetScale.
Problem
When running npx prisma studio
ERROR
Message: Error in Prisma Client request:
Invalid `prisma.mOCK_DATA.findMany()` invocation:
Can't reach database server at `us-east.connect.psdb.cloud`:`3306`
Please make sure your database server is running at `us-east.connect.psdb.cloud`:`3306`.
When running npx prisma db pull
ERROR
Prisma schema loaded from prisma\schema.prisma
Environment variables loaded from .env
Datasource "db": MySQL database "MydbName" at "us-east.connect.psdb.cloud:3306"
✖ Introspecting based on datasource defined in prisma\schema.prisma
Error: P1001
Please make sure your database server is running at `us-east.connect.psdb.cloud`:`3306`.
schema.prisma
generator client {
  provider        = "prisma-client-js"
  previewFeatures = ["referentialIntegrity"]
}
datasource db {
  provider             = "mysql"
  url                  = env("DATABASE_URL")
  referentialIntegrity = "prisma"
}
.env
DATABASE_URL="mysql://{username}:{password}#us-east.connect.psdb.cloud:3306/MydbName"
Summary
I cannot connect to the PlanetScale DB from my Next.js project without getting a connection ERROR. I have researched this issue online and found people with the same issue, but applying their changes did not fix anything, e.g. https://github.com/prisma/prisma/issues/5132. For instance, I tried appending ?ssl_mode=require&sslcert==us-east-1-bundle.pem to the DATABASE_URL, but nothing changed.
I will attempt this fix soon: How do I connect to a server with SSL from node.js on localhost?
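For comparison, a MySQL connection URL separates the credentials from the host with @ (the posted .env uses # there), and PlanetScale's documented Prisma setup adds the sslaccept=strict query parameter; the credentials below are placeholders:

```
# .env (placeholder credentials, not real values)
DATABASE_URL="mysql://username:password@us-east.connect.psdb.cloud:3306/MydbName?sslaccept=strict"
```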
I currently have an Azure Cosmos DB that I want to export data from. I am using the Azure Cosmos DB Data Migration Tool for this process, and on the first screen I am trying to verify my connection string to the DB but am getting the following error:
Error reading Object from jsonReader.Path,line 0,position 0
I have verified that my connection string is in the required format, and I have tried the different connection methods available.
Note: I am getting the same error when trying to use the CosmicClone app.
I have an Azure function that runs with a CosmosDBTrigger. It points correctly to my target database and collection. The CreateLeaseCollectionIfNotExists is set to true and the LeaseCollectionName is set to leases. When the function is started I receive this error:
Error indexing method '***'
Microsoft.Azure.WebJobs.Host.Indexers.FunctionIndexingException :
Error indexing method '***' ---> System.InvalidOperationException :
Cannot create Collection Information for *** in database *** with lease leases in database *** :
Partition key path /id is invalid for Gremlin API. The path cannot be '/id', '/label' or a nested path such as '/key/path'
It seems like Azure is creating the leases graph with '/id' as its partition key. Where did I go wrong?
The Azure Functions Cosmos DB Trigger documentation says it only works with SQL API accounts: https://learn.microsoft.com/azure/azure-functions/functions-bindings-cosmosdb-v2#supported-apis
In particular, the Trigger uses the Change Feed Processor library, which was designed to work with SQL API accounts; it relies on a lease collection that must be partitioned by /id, which is something Gremlin API accounts cannot do.
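For reference, this is roughly what the trigger's binding configuration looks like when pointed at a SQL API account (a sketch; the connection setting name and the database/collection names are placeholders):

```json
{
  "bindings": [
    {
      "type": "cosmosDBTrigger",
      "name": "documents",
      "direction": "in",
      "connectionStringSetting": "CosmosDBConnection",
      "databaseName": "mydb",
      "collectionName": "mycollection",
      "leaseCollectionName": "leases",
      "createLeaseCollectionIfNotExists": true
    }
  ]
}
```

The lease collection is created on the same account, so the account itself must be a SQL API account for the /id partition key to be valid.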
I am trying to query a Cosmos MongoDB collection. I can connect to it fine with Robo3T, Studio 3T, and the dotnet core mongo client (in a test harness). I can get a count of entities (db.[collection_name].count({})) on all of the platforms, but every query (db.[collection_name].find({})) fails with the following error:
Error: error: {
"_t" : "OKMongoResponse",
"ok" : 0,
"code" : 1,
"errmsg" : "Unknown server error occurred when processing this request.",
"$err" : "Unknown server error occurred when processing this request."}
Here is my sample query from Robo3T and, below that, the sample .NET harness. It doesn't matter which I use; it's the same error every time.
db.wihistory.find({})
and the dotnet core code:
using System.Security.Authentication;
using MongoDB.Bson;
using MongoDB.Driver;

string connectionString = @"my connections string here";
MongoClientSettings settings = MongoClientSettings.FromUrl(
    new MongoUrl(connectionString)
);
// Cosmos DB's MongoDB API requires TLS 1.2
settings.SslSettings =
    new SslSettings() { EnabledSslProtocols = SslProtocols.Tls12 };
var mongoClient = new MongoClient(settings);
var database = mongoClient.GetDatabase("vstsagileanalytics");
var collection = database.GetCollection<dynamic>("wihistory");
var data = collection.Find(new BsonDocument()).ToList();
System.Console.WriteLine(data.ToString());
The issue comes from mixing API usage in the account. As stated in the comments, you are using Azure Function's Cosmos DB Output binding, which uses the SQL API (.NET SDK for SQL API) to connect to the account and store data. There is a note in that documentation that says:
Don't use Azure Cosmos DB input or output bindings if you're using
MongoDB API on a Cosmos DB account. Data corruption is possible.
The documents stored through this method do not satisfy certain MongoDB requirements (like the existence of an "_id" identifier) that a MongoDB client would enforce (a MongoDB client automatically creates the "_id" if it is not present).
Robo3T and other Mongo clients (including the Azure Portal) fail to correctly parse and read the stored documents as valid MongoDB documents (due to the missing "_id" and similar requirements), and that is the cause of the error.
You can either switch to a Cosmos DB SQL API account if you want to keep the Azure Functions pipeline, or change the output binding and replace it with a manual implementation of a MongoDB client.
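To illustrate the "_id" point, here is a sketch of the difference between a Mongo-style insert and a store-as-is insert. The function names are hypothetical, and a uuid stands in for the BSON ObjectId a real MongoDB driver would generate:

```python
import uuid

def mongo_style_insert(collection: list, doc: dict) -> dict:
    """Mimic what a MongoDB client does on insert: add an _id if missing."""
    if "_id" not in doc:
        doc = {**doc, "_id": str(uuid.uuid4())}  # real drivers use a BSON ObjectId
    collection.append(doc)
    return doc

def sql_api_style_insert(collection: list, doc: dict) -> dict:
    """A SQL API writer stores the document as-is: no _id is added."""
    collection.append(doc)
    return doc

docs = []
mongo_style_insert(docs, {"name": "alice"})
sql_api_style_insert(docs, {"name": "bob"})
# The second document has no _id, which Mongo clients expect on every document
print([("_id" in d) for d in docs])  # → [True, False]
```

Documents written the second way are what Mongo tooling then chokes on when reading the collection back.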
I have an ASP.NET application that uses application services to manage user roles. The application worked fine when I was using a local database. When I transitioned to Azure, the main data tables still work, but I am unable to have users log in or create new accounts because the Application table cannot be found. Previously, the application services created these tables automatically when using a local database. The connection string name appears accurate. This is my first attempt at Azure... Is there something I'm missing?
The error is "The entity type Application is not part of the model for the current context"
For additional context: When I try to create the membership tables in the Azure database using aspnet_regsql.exe, I get the following error: "An error occurred during the execution of the SQL file 'InstallCommon.sql'. The SQL error number is 40508 and the SqlException message is: USE statement is not supported to switch between databases. Use a new connection to connect to a different database."