Azure Data Explorer: write query result to Cosmos DB - azure-cosmosdb

I have an Azure Data Explorer query that filters and aggregates incoming syslogs. The output of this query is stored in a CSV file on my local computer: every time I run my Python SDK script, it runs the query and saves the output to a CSV file.
What I am looking for is a way to push that query result to Cosmos DB.
Looking into the azure-sdk-for-python repository on GitHub, I found a library that can achieve this with the following code.
from azure.cosmos import CosmosClient
import os

url = os.environ['ACCOUNT_URI']
key = os.environ['ACCOUNT_KEY']
client = CosmosClient(url, credential=key)

database_name = 'testDatabase'
database = client.get_database_client(database_name)
container_name = 'products'
container = database.get_container_client(container_name)

for i in range(1, 10):
    container.upsert_item({
        'id': 'item{0}'.format(i),
        'productName': 'Widget',
        'productModel': 'Model {0}'.format(i)
    })
But I am a bit confused, because they mention a container.
I was wondering if there is a way I can push my query result to a database or table using the Python SDK.
Thank you so much for your time and help

In Cosmos DB terminology, a Container is equivalent to a Table, in that a Container holds the data like a Table does. If you're coming from the relational database world, here's the mapping (kind of):
Database Server --> Cosmos DB Account
Database --> Database
Table --> Container
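Putting the question and the mapping together: a minimal sketch for pushing the rows of the exported CSV into a container could look like the following. The container name 'syslogs' and the id column are assumptions about your query output, and the Cosmos SDK import sits inside the upload function so the CSV helper stays usable without the package installed.

```python
import csv
import io

def csv_rows_to_items(csv_text, id_field):
    """Turn CSV text into a list of dicts ready for upsert_item.

    Cosmos DB requires a string 'id' on every document; here it is
    copied from an existing column (an assumption about your CSV).
    """
    items = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        row['id'] = str(row[id_field])
        items.append(row)
    return items

def push_items(items, database_name='testDatabase', container_name='syslogs'):
    """Upsert the prepared documents into a Cosmos DB container."""
    import os
    # Imported here so the helper above works without the SDK installed.
    from azure.cosmos import CosmosClient
    client = CosmosClient(os.environ['ACCOUNT_URI'],
                          credential=os.environ['ACCOUNT_KEY'])
    container = (client.get_database_client(database_name)
                       .get_container_client(container_name))
    for item in items:
        container.upsert_item(item)
```

With this, `push_items(csv_rows_to_items(open('result.csv').read(), 'host'))` would upload the file your script already writes; 'result.csv' and the 'host' column are placeholders for your own names.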

Related

exporting query result in Kusto to a storage account

I get the error below when running my Kusto query (which I am still learning) in Azure Data Explorer, and the reason is that the result is more than 64 MB. When I read the MS docs, they mentioned we have something called Export which we can use to export the query result to a storage blob. However, I couldn't find any example of how to do it. Has someone tried it? Can they provide a sample of how to export to a storage account when my result set is more than 64 MB?
Here is what I have for now. I have data in an external table which I would like to query. So, for example, if my external table name is TestKubeLogs, I am querying
external_table("TestKubeLogs")
| where category == 'xyz'
and then I get error as
The Kusto DataEngine has failed to execute a query: 'Query result set has exceeded the internal data size limit 67108864 (E_QUERY_RESULT_SET_TOO_LARGE).'
so now I am trying to export this data. How should I do it. I started writing this but how do I specify the category and table name.
.export
async compressed
to json (
    h@"https://azdevstoreforlogs.blob.core.windows.net/exportinglogs;mykey=="
)
You are not headed in the right direction.
Regardless of the source of your data, whether you're querying an external table or a managed table, it does not make sense to pull a large data volume to your client.
What would you do with millions of records presented on your screen?
This is true of all data systems.
Please read the relevant documentation:
https://learn.microsoft.com/en-us/azure/data-explorer/kusto/concepts/resulttruncation
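That said, for completeness: in the documented `.export` shape, the source query (table name and filter) follows a `<|` operator after the property list, so the asker's fragment would be completed roughly as below (the storage URI and key are the placeholders from the question):

```kusto
.export async compressed to json (
    h@"https://azdevstoreforlogs.blob.core.windows.net/exportinglogs;mykey=="
) <|
external_table("TestKubeLogs")
| where category == 'xyz'
```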

SQLite Importer will overwrite my database when I load my application?

I have an Ionic app using SQLite. I don't have any problems with the implementation.
The issue is that I need to import an SQL file using SQLitePorter to populate the database with configuration info.
But the same database also holds user info, so my question is:
Every time I start the app, will it import the SQL file, fill the database, and probably overwrite my user data too, since it is all in the same database?
I assume that you can always init your tables using string queries inside your code. The problem is not that you are importing a .sql file, right?
According to https://www.sqlitetutorial.net/sqlite-create-table/ you can always create a table with the optional IF NOT EXISTS clause. By writing a query like:
CREATE TABLE IF NOT EXISTS schema_name.table_name (
    column_1 data_type PRIMARY KEY);
you let SQLite decide whether to create the table, without the risk of overwriting an existing one. You can trust that SQLite is smart enough not to overwrite any information, especially if you use the 'BEGIN TRANSACTION' - 'COMMIT' procedure.
I give my answer assuming that you keep the imported data and the user data in distinct tables, so you can control what you repopulate and what you don't. Is that right?
What I usually do is have an SQL file like this:
DROP TABLE IF EXISTS configuration_a;
DROP TABLE IF EXISTS configuration_b;
CREATE TABLE configuration_a (...);
INSERT INTO configuration_a (...);
CREATE TABLE configuration_b (...);
INSERT INTO configuration_b (...);
CREATE TABLE IF NOT EXISTS user_data (...);
This means that every time the app starts, I update the configuration data I have at that time (that is why we use http.get to fetch the configuration file from a remote repo in the future) and create the user data table only if user_data is not there (hopefully only on the initial start).
Conclusion: In my opinion, it's always good practice to trust the database product and let it handle any transaction that would be risky if you implemented it yourself in your code, since it provides tools for that. For example, the IF NOT EXISTS clause is always safer than implementing a table checker yourself.
I hope that helps.
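A small sketch of this pattern with Python's built-in sqlite3 module (the tables and values are made up for illustration) shows that reloading the configuration tables leaves user data untouched:

```python
import sqlite3

CONFIG_RELOAD = """
DROP TABLE IF EXISTS configuration_a;
CREATE TABLE configuration_a (key TEXT, value TEXT);
INSERT INTO configuration_a VALUES ('theme', 'dark');
CREATE TABLE IF NOT EXISTS user_data (name TEXT);
"""

def app_start(conn):
    # Runs on every start: configuration is rebuilt from scratch,
    # user_data is only created when missing.
    conn.executescript(CONFIG_RELOAD)

conn = sqlite3.connect(":memory:")
app_start(conn)                                  # first start
conn.execute("INSERT INTO user_data VALUES ('dina')")
app_start(conn)                                  # second start: config reloaded
users = conn.execute("SELECT name FROM user_data").fetchall()
# users still contains the row inserted between the two starts
```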
PS: Regarding the create-database procedure: when SQLite connects to a database file that doesn't exist, it creates it. For someone comfortable with the SQLite command line, typing
sqlite3 /home/user/db/configuration.db will connect you to this db, and if the file is not there, it will create it.

specifying Azure blob virtual folder instead of file for ingesting into Kusto

Referring to the .ingest into table <tablename> feature: as per the documentation, we need to specify a direct file name (blob). But it is more common to have a bunch of text files in a given blob path, all of which need to be imported. Is there a way we can specify a path? I have tried, but Kusto won't accept a folder path.
Kusto does not iterate over folders or containers.
Zip all your files up into a single file and place it on blob storage. This .ingest into command worked for me:
.ingest into table Blah (
    h@'https://YOURACCOUNT.blob.core.windows.net/somefolder/FileFullofCsvs.zip;YOURKEY'
)
with (
    format = "csv",
    ignoreFirstRecord = true,
    zipPattern = "*.csv"
)
You can probably achieve this by creating an external table referencing your blob storage folder.
Generate a SAS token
Generate a SAS token for your blob storage folder (make sure to select read and list permissions, and any others as appropriate).
Create the external table
Here is the Kusto query:
.create external table myExternalTable (ProductID:string, Name:string, Description:string, ExpiryDate:datetime)
kind=blob
dataformat=csv
(
    h@'https://{storageaccount}.blob.core.windows.net/{file system}/{folder name}?{SAS token url generated from step1}'
)
Create Table in Azure Data Explorer DB
Set or Append data to Azure Data Explorer database table.
.set-or-append myProductTable with (extend_schema=true) <| external_table("myExternalTable")
Query the table
This will list all the data rows in the table
myProductTable

how to open sqlite database in react native

I'm using an SQLite database in my React Native project.
I made an app_db database that contains a users table,
and the path of the database is:
/home/dina/AppointmentApp/android/app/src/main/assets/app_db
Now I want to access the users table in my project and execute some queries like select, update, insert, etc.
Here is the code that opens the connection:
var db = SQLite.openDatabase({name: "app_db", createFromLocation: "~app_db"});
db.transaction((tx) => {
  tx.executeSql("select * from users", [], (tx, results) => {
  });
});
I don't know if this query is correct or not.
It returns in the console:
-OPEN database: app_db
-{message: "no such table: users (Sqlite code 1): , while comp…m users, (OS error - 2:No such file or directory)", code: 0}
I know this is old, but as per the error code: your database does not have the table "users".
If you are certain that it does, then perhaps the path you point to is wrong, and the database is created automatically (without a users table).
See here for a similar problem: No Such Table error
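The diagnosis is easy to reproduce outside React Native: opening a database path that doesn't exist silently creates an empty database, and listing sqlite_master then shows no users table. A sketch with Python's sqlite3 module (the plugin's behavior on a wrong createFromLocation path is analogous):

```python
import os
import sqlite3
import tempfile

# Point at a file that does not exist yet: SQLite silently creates it.
path = os.path.join(tempfile.mkdtemp(), "app_db")
conn = sqlite3.connect(path)

def table_names(conn):
    rows = conn.execute("SELECT name FROM sqlite_master WHERE type='table'")
    return [r[0] for r in rows]

tables_before = table_names(conn)   # empty: hence "no such table: users"
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
tables_after = table_names(conn)    # now the select would succeed
```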

how to let users to select columns in a database table

We are trying to build a web application that allows users to select appropriate columns for a given database table.
I wonder if there is an API for this - I've googled it a while in vain. Otherwise, if you can give some clues (patterns or sample code) on how to build such a component, that would be great and appreciated.
You could base your application on the INFORMATION_SCHEMA views/tables. This is the documentation for SQL Server, but you can easily find the equivalent for other databases too:
http://msdn.microsoft.com/en-us/library/ms186778.aspx
Sample SQLs:
select * from INFORMATION_SCHEMA.TABLES
select * from INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'users'
If you want to use this solution with many databases, to decouple your application from the db engine, you can define an IMetadataProvider interface and create implementations for the different databases:
interface IMetadataProvider {
    ...GetTables();
    ...GetTableColumns();
    ...GetTableRelations();
    //Other functions required by your project
}
You can also create your own query builder interface:
interface IQueryBuilder {
    ...From(string tableName);
    ...Top(int numberOfRows); //TOP for SQL SERVER, LIMIT for MySQL
}
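As one concrete illustration of the IMetadataProvider idea, here is a minimal Python sketch backed by SQLite, which exposes its catalog through sqlite_master and PRAGMA table_info rather than INFORMATION_SCHEMA; the class and method names are assumptions, not a real API:

```python
import sqlite3

class SqliteMetadataProvider:
    """One possible implementation of the metadata-provider interface."""

    def __init__(self, conn):
        self.conn = conn

    def get_tables(self):
        rows = self.conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")
        return [r[0] for r in rows]

    def get_table_columns(self, table):
        # PRAGMA table_info rows are (cid, name, type, notnull, default, pk).
        rows = self.conn.execute("PRAGMA table_info({})".format(table))
        return [r[1] for r in rows]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
provider = SqliteMetadataProvider(conn)
```

A provider for SQL Server would implement the same methods on top of INFORMATION_SCHEMA.TABLES and INFORMATION_SCHEMA.COLUMNS, which is what lets the UI stay engine-agnostic.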
