I am trying to access Azure Table Storage via Python.
Following an old walkthrough here:
https://learn.microsoft.com/en-us/azure/cosmos-db/table-storage-how-to-use-python#install-the-azure-storage-sdk-for-python
but the Python SDK it references for Azure Tables specifically (https://github.com/Azure/azure-storage-python) has been moved/deprecated in favor of the Azure Cosmos DB SDK.
In the deprecation note, they say to use this SDK:
https://github.com/Azure/azure-cosmosdb-python
In the documentation for that SDK, they refer you to https://azure.microsoft.com/en-us/develop/python/
The Table Storage link on that page refers you back to the first link (!!)
============
1) All I want to do is query traditional Azure Table Storage (NOT CosmosDB) with a Python SDK
2) Ideally, that Python SDK also includes the encryption/decryption capability for Azure Tables.
What am I missing / does that python SDK still exist anywhere?
Note:
I see https://github.com/Azure/azure-cosmosdb-python/tree/master/azure-cosmosdb-table
but this SDK seems to require a Cosmos DB deployment -- it can't connect to traditional Azure Tables. Is my understanding incorrect?
Thanks for any help you can offer.
The Azure Cosmos DB Table SDK IS the Azure Storage Tables SDK. The re-branding is part of a re-org inside Microsoft, but it is the same code, same endpoint, same everything.
The Storage SDK used to be one big client; it was split into Table/Queue/Blob/Files packages in order to give ownership of Table to the Cosmos DB team.
https://learn.microsoft.com/en-us/azure/cosmos-db/table-support
The new Azure Cosmos DB Python SDK is the only SDK that supports Azure
Table storage in Python. This SDK connects with both Azure Table
storage and Azure Cosmos DB Table API.
You can also compare the code, and you'll see:
https://github.com/Azure/azure-storage-python/tree/v0.36.0/azure/storage/table
https://github.com/Azure/azure-cosmosdb-python/tree/master/azure-cosmosdb-table/azure/cosmosdb/table
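To make the comparison concrete, here is a minimal sketch (account name, key, table, and entity values are placeholders) of the azure-cosmosdb-table package talking to a classic storage account with the same TableService API the old azure-storage package exposed; as far as I know it also carries over the client-side encryption hooks (require_encryption / key_encryption_key) from the old SDK:
# pip install azure-cosmosdb-table
from azure.cosmosdb.table.tableservice import TableService
from azure.cosmosdb.table.models import Entity

# Placeholder credentials for a classic storage account (not a Cosmos DB account).
table_service = TableService(account_name="mystorageaccount", account_key="<account-key>")

# Insert and query exactly as with the old azure-storage table client.
entity = Entity()
entity.PartitionKey = "pk1"
entity.RowKey = "rk1"
entity.message = "hello"
table_service.insert_or_replace_entity("mytable", entity)

for e in table_service.query_entities("mytable", filter="PartitionKey eq 'pk1'"):
    print(e.RowKey, e.message)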
(I work at MS in the Azure SDK for Python team)
Azure Table Storage has a new Python library in preview release that is available for installation via pip. To install it, use the following pip command:
pip install azure-data-tables
This SDK is able to target either a Tables or Cosmos endpoint (albeit there are known issues with Cosmos).
For your use case of querying an Azure Table Storage account, there are two query methods.
Querying a single table:
from azure.data.tables import TableClient
table_client = TableClient.from_connection_string(conn_str, table_name="myTableName")
query_filter = "RowKey eq 'row_key_5'"
# Pass the OData filter string as the first argument (the keyword name changed between releases).
for entity in table_client.query_entities(query_filter):
    print(entity)
Querying a storage account for tables:
from azure.data.tables import TableServiceClient
table_service_client = TableServiceClient.from_connection_string(conn_str)
query_filter = "TableName eq 'myTable'"
# query_tables lists tables in the account; in newer releases the attribute is table.name.
for table in table_service_client.query_tables(query_filter):
    print(table.table_name)
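If the table does not exist yet, here is a short sketch (reusing the same conn_str and placeholder names as above) of creating it and adding an entity before querying:
from azure.data.tables import TableClient

table_client = TableClient.from_connection_string(conn_str, table_name="myTableName")
table_client.create_table()  # raises ResourceExistsError if the table already exists

entity = {"PartitionKey": "pk1", "RowKey": "row_key_5", "message": "hello"}
table_client.create_entity(entity=entity)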
For more samples of the library, check out those hosted in the Azure GitHub repository.
(FYI I work at Microsoft on the Azure SDK for Python team)
Indeed, Azure hides part of the Table Storage SDK links to help promote the Cosmos DB Table API. As you mentioned in your answer, the Azure Storage Table SDK is now incorporated into the Cosmos DB menu.
However, I found the old Azure Table Storage Python SDK from a previous version in the repo.
You could refer to the above link even if it's no longer updated.
By the way, you can see the benefits of moving from Azure Table Storage to the Azure Cosmos DB Table API at this link.
Hope it helps you.
Related
I need to turn on Cosmos DB Diagnostic Full-text Query as part of a deployment pipeline. Ideally this would be part of an ARM template; if not, then CLI or PowerShell would do. I couldn't find any documentation on this; has anybody done it?
I also exported the ARM template for a Cosmos DB account that has this setting turned On in the portal, but couldn't find anything relevant in the template; when I redeployed the template, the setting was Off.
Thanks in advance!
I am using gcloud + firebase cli to explore a reproducible way to create and configure a GCP + Firebase project.
I created the GCP project using the gcloud CLI tool. I then used the Firebase CLI to run the command firebase init firestore.
Ultimately it ended up outputting...
Error: It looks like you haven't used Cloud Firestore in this project before. Go to https://console.firebase.google.com/project/my-project/firestore to create your Cloud Firestore database.
Is there a way I can "create my firestore database" using a cli tool or scripting api, instead of having to navigate to a web GUI tool and manually execute steps?
Unless Firebase does more behind the scenes, creating a Firestore instance could be as simple as this command from the gcloud CLI reference:
gcloud firestore databases create --region=us-central --project=my-project
The documentation for the entire create-provision cycle (from GCP's perspective) is here - including Terraform details (some commands may have been released since the docs were written).
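If you want the whole provisioning flow in a reproducible script, a minimal sketch (project ID and region are placeholders, and the step of enabling the Firestore API first is my assumption for a fresh project) that wraps the same gcloud commands from Python could look like this:
import subprocess

PROJECT = "my-project"   # placeholder project ID
REGION = "us-central"    # placeholder region

def run(*args):
    # Run a gcloud command against the target project and fail loudly on errors.
    subprocess.run(["gcloud", *args, "--project", PROJECT], check=True)

# Enabling the API may be needed on a brand-new project (my assumption).
run("services", "enable", "firestore.googleapis.com")
run("firestore", "databases", "create", f"--region={REGION}")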
We have a need to pre-populate Cosmos DB containers with some static json files. This is a requirement for both local developer Cosmos DB emulator environments, and also Azure DevOps deployments. Ideally those two scenarios would use the same approach of course.
One way I was thinking was to have static json documents in our git repo, and to have a dotnet core command line tool that would connect to Cosmos DB and insert one or more docs to a specified DB and container, per invocation of the console app.
I found this tool which seems like a good fit:
https://github.com/azure/azure-documentdb-datamigrationtool
https://learn.microsoft.com/en-us/azure/cosmos-db/import-data
However, this targets .NET Framework 4.5, and therefore cannot be used easily by our Mac and Linux developers. So one option would be to have a go at migrating that tool to dotnet core.
I also found these bash scripts that seem relevant:
https://github.com/Krumelur/AzureScripts/blob/master/cosmosdb_create_document.sh
Use bash, Azure CLI and REST API to access CosmosDB - how to get token and hash right?
I.e. Windows users could use WSL to run these.
However, I think a dotnet core console app would be the ideal solution here. It seems like an obvious, simple tool to want, so I was wondering if there is anything already out there.
Or maybe am I thinking about this problem the wrong way?
There isn't anything out there today, but the DMT you referenced in the first link (which targets .NET 4.5) is being transitioned to a new maintainer, Solliance, and is going to be ported over to .NET Core, though there is no ETA as of yet.
The only thing I can suggest is to roll your own app to read from blob storage and insert into Cosmos (a sketch of that idea follows below). The other possible option is to use Azure Data Factory and create/update a job with the new endpoint and keys whenever you roll a new environment.
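For reference, here is a minimal sketch of the roll-your-own approach, written in Python with the azure-cosmos package rather than the dotnet core console app you had in mind (endpoint, key, database/container names, and the ./seed-data folder are all placeholders); it reads the static JSON files from the repo and upserts them into a container:
import json
import pathlib

from azure.cosmos import CosmosClient

# Placeholder endpoint/key: point this at the local emulator or an Azure account.
client = CosmosClient("https://localhost:8081/", credential="<account-key>")
container = client.get_database_client("mydb").get_container_client("mycontainer")

# Each .json file in the folder is one document; assumes every doc carries an "id".
for path in pathlib.Path("./seed-data").glob("*.json"):
    doc = json.loads(path.read_text())
    container.upsert_item(doc)
    print(f"upserted {doc['id']} from {path.name}")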
Context: I have a project with Datastore which already has data loaded. We now want to use Cloud Firestore (native mode), but we realized that migration is not possible. What alternatives do I have to use Cloud Firestore (native mode)?
Update June 16, 2021:
You can now do gcloud datastore export in your first project, followed by gcloud firestore import in your new project. The longer more involved migration below is no longer needed.
Just keep in mind that the Datastore export goes to a Cloud Storage bucket. Make sure that the account running the Firestore import has access to that bucket.
Original answer from 2019
I just migrated from Datastore to Firestore (native mode) for one of my web apps. Here is what I needed to do:
Create a new GCP project, as Firestore (native mode) and Datastore
can't co-exist in the same project.
Migrate the data from Datastore in my old project to Firestore (native mode) in my new project. As of this writing, there are no tools to do that in an automatic way. I wrote Python scripts that read all records from Datastore and wrote them to Firestore in the new project (a minimal sketch of that loop is shown after this list). These scripts ran locally on my machine, using service account keys downloaded from the Cloud Console.
(Side note: You might be tempted to use gcloud datastore export followed by gcloud firestore import. It seems to work and no error messages pop up when you do. But doc IDs and JSON properties don't translate well. This was a big time-sink for me. Don't go down this road.)
Rewrite the data access layer in your app. Firestore (native mode) has a different API than Datastore.
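For illustration, here is a minimal sketch of the kind of copy script described above, assuming the google-cloud-datastore and google-cloud-firestore packages and placeholder project IDs and kind name:
from google.cloud import datastore, firestore

# Clients for the old (Datastore) project and the new (Firestore native mode) project.
ds_client = datastore.Client(project="old-project-id")
fs_client = firestore.Client(project="new-project-id")

KIND = "MyKind"  # placeholder entity kind, reused as the Firestore collection name

for entity in ds_client.query(kind=KIND).fetch():
    # Use the Datastore key name/ID as the Firestore document ID.
    doc_id = str(entity.key.id_or_name)
    # Nested keys and Datastore-specific value types may need extra conversion here.
    fs_client.collection(KIND).document(doc_id).set(dict(entity))
    print(f"copied {KIND}/{doc_id}")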
This was a fair amount of work, but it was worth it in my case:
I was able to retire a lot of server-side code because the clients can access the database directly.
I was able to retire a lot of client-side code for supporting offline mode because the Firestore client library implements it already.
Best of luck!
Unfortunately, you'll need to create your Cloud Firestore database in a new project and allow your existing service accounts to access that new database.
I'm using Flask on appfog.com to make a personal blog. Today I tried to use SQLite. I can run the application locally with SQLite, but when I update the app on AppFog it does not seem to work. I can't find how to use SQLite in AppFog's docs. Can anyone tell me?
Thanks...
Sorry for my poor English :-)
It's not recommended to use SQLite for your production apps on AppFog because the file storage is ephemeral: every time you update your app, the database will get blown away. You're better off creating and binding a PostgreSQL, MySQL, or MongoDB database service for your app. You can continue to use a SQLite db locally, but your production app will use the bound service.
See the Bind Service section of: https://docs.appfog.com/languages/python/flask
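As an illustration only (the exact service-binding format on AppFog is my assumption, based on the Cloud Foundry-style VCAP_SERVICES environment variable), a minimal sketch of keeping SQLite locally while using a bound database in production could look like this:
import json
import os

from flask import Flask

app = Flask(__name__)

def database_uri():
    # Locally there is no VCAP_SERVICES, so fall back to SQLite.
    vcap = os.environ.get("VCAP_SERVICES")
    if not vcap:
        return "sqlite:///blog.db"
    services = json.loads(vcap)
    # Assumes a bound PostgreSQL service that exposes a connection URI in its credentials.
    for name, instances in services.items():
        if "postgres" in name.lower():
            return instances[0]["credentials"]["uri"]
    return "sqlite:///blog.db"

app.config["SQLALCHEMY_DATABASE_URI"] = database_uri()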