Does RocksDB support any mechanism/API to export the saved records to a text file like JSON or CSV? - rocksdb

I want to use RocksDB as storage. I want to export the database to a human-readable format like JSON or CSV. Does RocksDB support that? If yes, is it also possible to import the data back into the database?
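As far as I know, RocksDB has no built-in JSON/CSV exporter (the bundled ldb tool can scan keys and values as plain text, but not in those formats), but since it is a key-value store both the export and the re-import can be done by iterating over all records. A minimal sketch, assuming the python-rocksdb binding and UTF-8 text keys/values (database paths are placeholders):

import json
import rocksdb  # python-rocksdb binding; UTF-8 text keys/values are an assumption

# Export: iterate every record and dump it to JSON.
db = rocksdb.DB("mydata.db", rocksdb.Options(), read_only=True)
it = db.iteritems()
it.seek_to_first()
records = {k.decode("utf-8"): v.decode("utf-8") for k, v in it}
with open("dump.json", "w") as f:
    json.dump(records, f, indent=2)

# Import: read the JSON back and put every record into a (new) database.
db2 = rocksdb.DB("restored.db", rocksdb.Options(create_if_missing=True))
with open("dump.json") as f:
    for k, v in json.load(f).items():
        db2.put(k.encode("utf-8"), v.encode("utf-8"))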

Related

Lightroom SQLite database binary XMP format

The Lightroom catalog is a SQLite database. Some of the metadata values are stored in the Adobe_AdditionalMetadata.XMP column, which is a BLOB data type.
When I save this blob, it is a binary file that I have no idea how to convert to/from an editable form.
According to the documentation, an XMP file has XML format.
Here is an example of such a blob from my database.
I was advised on the Lightroom forums that such columns use a non-standard SQLite compress module.
Here is the link to the SQLite Windows binaries recompiled with non-standard compress module: https://drive.google.com/file/d/1EuSB8SrOA2nAhwTqjI3V1xK44IyxI9gt/view?usp=sharing
It can extract the XMP properties:
select uncompress(xmp) from Adobe_AdditionalMetadata where id_local = 4539794;
But it cannot write it back yet:
update Adobe_AdditionalMetadata
set xmp = compress('some valid xmp string value')
where id_local = 4539794;
It executes successfully, but you won't be able to read it back with the uncompress() function.
That's because the library from the link above fixes only the uncompress() function. I am working on fixing the compress() function as well.

specifying Azure blob virtual folder instead of file for ingesting into Kusto

Referring to the .ingest into table <tablename> feature: as per the documentation we need to specify a direct file name (blob). But it is more common to have a bunch of text files in a given blob path, all of which need to be imported. Is there a way to specify a path? I have tried, but Kusto won't accept a folder path.
Kusto does not iterate over folders or containers.
Zip all your files up into a single archive and place it on blob storage (a sketch of that step follows the command below). This [.ingest into] command worked for me:
.ingest into table Blah (
h@'https://YOURACCOUNT.blob.core.windows.net/somefolder/FileFullofCsvs.zip;YOURKEY'
)
with (
format = "csv",
ignoreFirstRecord = true,
zipPattern="*.csv"
)
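The zip-and-upload step itself is not shown in the answer. A minimal sketch, assuming the Python zipfile module and the azure-storage-blob package (the CSV file names, container, and connection string are placeholders):

import zipfile
from azure.storage.blob import BlobServiceClient

# Bundle the CSV files into one archive (file names are hypothetical).
with zipfile.ZipFile("FileFullofCsvs.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for name in ["orders.csv", "customers.csv"]:
        zf.write(name)

# Upload the archive to the container referenced by the .ingest command above.
service = BlobServiceClient.from_connection_string("<connection string>")
blob = service.get_blob_client(container="somefolder", blob="FileFullofCsvs.zip")
with open("FileFullofCsvs.zip", "rb") as data:
    blob.upload_blob(data, overwrite=True)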
You can probably achieve this by creating an external table referencing your blob storage folder.
Generate SAS token
Generate a SAS token for your blob storage folder (make sure to select Read and List permissions, and any others as appropriate).
Create external table
Here is the Kusto query:
.create external table myExternalTable (ProductID:string, Name:string, Description:string, ExpiryDate:datetime)
kind=blob
dataformat=csv
(
h@'https://{storageaccount}.blob.core.windows.net/{file system}/{folder name}?{SAS token url generated in step 1}'
)
Create table in Azure Data Explorer DB
Set or append data to the Azure Data Explorer database table:
.set-or-append myProductTable with (extend_schema=true) <| external_table("myExternalTable")
Query the table
This will list all the data rows in the table
myProductTable

Read a csv file and insert the values in mysql database using R

I am able to read a CSV file using the read function. I now want to insert the values into a table in a MySQL database. I have to make it dynamic, so that if the content of the CSV changes it can still insert.
Your post is very broad. I advise you to go step by step and read the dplyr documentation.
I use dplyr for persistence in a MySQL database. It is a powerful package.
https://shiny.rstudio.com/articles/pool-dplyr.html

export result into excel sheet from teradata sql assistant

I want to export the results into an Excel sheet by running the query in Teradata SQL Assistant.
I used copy-paste but it didn't work.
Thanks in advance.
If you return the answers to SQL Assistant, you should be able to select Save Answerset from the File menu. You will then have the option to save it in a proper Excel file format.
If you export the answers to a flat file directly, the delimited text file can in turn be opened with ease in Excel and then saved in a proper Excel file format (XLS, XLSX, etc.).
Select the whole Excel worksheet you will paste into and set the number format to 'Text'.
Now you can safely copy the data from the Teradata SQL Assistant query results and paste it into the spreadsheet.

Teradata SQL export to csv

Is there a way to run a Teradata SQL query and then export the data to an external file?
For example, if I ran:
SELECT TOP 10
*
FROM
mydb.mytable
Could I write this in such a way that it will export to a CSV? Do I need to store my data in a temp table first and then do CREATE EXTERNAL TABLE? Any ideas would be appreciated - I've been returning data to R and then exporting, but there's no need for the intermediate step in some jobs.
There's no CREATE EXTERNAL TABLE in Teradata.
The only tool capable of exporting CSV directly is TPT (Teradata Parallel Transporter).
Otherwise you have to do the concatenation in your query using SELECT TRIM(col1) || ',' || TRIM(col2)... and then export based on your client's capabilities.
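For the "export based on your client's capabilities" part, one client-side option is to run the query from Python and write the rows out with the csv module. A minimal sketch, assuming the teradatasql driver; host, credentials, table, and output file are placeholders:

import csv
import teradatasql  # Teradata's Python driver; connection details below are placeholders

with teradatasql.connect(host="mydbhost", user="myuser", password="mypassword") as con:
    cur = con.cursor()
    cur.execute("SELECT TOP 10 * FROM mydb.mytable")
    with open("mytable.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # column names as the header row
        writer.writerows(cur.fetchall())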
