I want to download the data as a JSON object after sorting the table - Angular 12

I have an Angular project in which I have successfully applied sorting to a mat-table. What I want to do is download the data as a JSON object after sorting. Is that possible? If so, how can I do it?

You can use the mat-table-exporter library; it helps you export the data from a material table in CSV, Excel, TXT, and JSON formats. Install it with:
npm i mat-table-exporter
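As a rough sketch of how the directive is typically wired up (based on the library's README; dataSource, the column definitions, and the file name are placeholders from your own component):

// app.module.ts: register the exporter module
import { MatTableExporterModule } from 'mat-table-exporter';

@NgModule({
  imports: [
    // ...your existing Material modules...
    MatTableExporterModule
  ]
})
export class AppModule {}

<!-- component template: attach the directive to the sorted table -->
<table mat-table matTableExporter #exporter="matTableExporter"
       [dataSource]="dataSource" matSort>
  <!-- your column definitions ... -->
</table>
<button (click)="exporter.exportTable('json', { fileName: 'sorted-data' })">
  Download JSON
</button>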

Related

Update a CSV table in SQLite Studio

I have a bunch of CSVs that get updated locally on my computer every few days. I want to refresh them in SQLiteStudio, but I can't find where to actually refresh. Is there an option to do this? The only way I've been able to refresh is to fully delete the table and then re-import it under the same name (so the query still works). All of the CSVs and SQLiteStudio are local on my computer; I am not running anything remote.
A CSV file is not linked in any way with SQLiteStudio. Once you import data into a table, it is in the table, not in the CSV file. If you want to refresh the contents of a table with data from a CSV file, you need to do exactly what you already do, that is, re-import.
A useful tool to make this repeatable task less clumsy is the import() SQL function built into SQLiteStudio. You can delete the old data and re-import the new data in a single execution:
delete from your_table;
select import('path/to/file.csv', 'CSV', 'your_table', 'UTF-8');
Of course you need to adjust the parameters. There can also be a fifth (optional) parameter specifying import options, just like in the Import Dialog. Quoting from the User Manual (https://github.com/pawelsalawa/sqlitestudio/wiki/User_Manual#built-in-sql-functions):
charsets() Returns list of charsets supported by SQLiteStudio (to be used for example in arguments for import() function)
import_formats() Returns list of importing formats supported by SQLiteStudio (depends on import plugins being loaded)
import_options(format) Returns list of currently used importing settings for certain format (the format must be one of formats returned from import_formats()). Each setting in a separate line. Each line is a setting_name=setting_value
import(file, format, table, charset, options) Executes the importing process using file for input and format for choosing the import plugin (must be one of the values returned from import_formats()). The import is done into the table. If the table does not exist, it will be created. The charset is optional and must be one of the values returned from charsets() (for example 'UTF-8'); it defaults to UTF-8. The options argument is optional and has to be in the same format as returned from import_options() (one option per line, each line being option_name=value); it's okay to provide only a subset of options, in which case the remaining settings keep their current values.
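For example, you could inspect the current CSV options and then override just one of them in the same re-import (assuming 'CSV' is among the names returned by import_formats(); the option name below is a placeholder, use a name actually returned by import_options('CSV')):

-- see which option names the CSV import plugin currently uses
select import_options('CSV');

-- re-import, overriding only a subset of the options
delete from your_table;
select import('path/to/file.csv', 'CSV', 'your_table', 'UTF-8', 'some_option=some_value');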

Firebase DB imports JSON data in alphabet order

When I import JSON data into Firebase using the import option in the GUI, the data is loaded into Firebase in alphabetical order. But I need the data to be loaded in the order I have in the JSON. Does anyone know a way to load JSON data into the Firebase DB in the order it appears in the JSON file?
The console always shows children in that order. That's just how the console works.
If you need ordering in your app, your query should indicate the order of results using one of the ordering methods.
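As a minimal sketch with the Realtime Database web SDK, assuming each child was written with a numeric position property (a hypothetical field you would add to your JSON before importing):

// children are delivered in ascending order of "position"
firebase.database().ref('items')
  .orderByChild('position')
  .once('value')
  .then(snapshot => {
    snapshot.forEach(child => {
      console.log(child.key, child.val());
    });
  });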

Add file name as column in data factory pipeline destination

I am new to Data Factory. I am loading a bunch of CSV files into a table and I would like to capture the name of each CSV file as a new column in the destination table.
Can someone please help me achieve this? Thanks in advance.
If you use a Mapping Data Flow, there is an option under the source settings to capture the file name being read, and later it can be mapped to a column in the sink.
If your destination is Azure Table Storage, you could put the file name into the partition key column. Otherwise, I think there is no native way to do this with ADF; you may need a custom activity or a stored procedure.
One post said they could use Databricks to handle this:
Data Factory - append fields to JSON sink
Another post said they are using U-SQL to handle this:
use adf pipeline parameters as source to sink columns while mapping
For the stored procedure approach, please see this post: Azure Data Factory mapping 2 columns in one column

Export hive query output to multiple tabs in excel

When running hive -e "query" > mysheet.xls, I am able to export the output to a new Excel file.
Could you help me with exporting another Hive query into the already created Excel file, on a different sheet (without overwriting the existing file/data)?
Is this possible with a Hive query? Please help.
The issue here is that you are using the stdout redirect >, which always creates a new file. If you use the appending redirect >>, the output is added to your current file (it does not create a new sheet in Excel).
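For example, this appends the second query's rows to the same file, but they still end up on the one and only sheet:

hive -e "second query" >> mysheet.xls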
That said, your query is probably just producing a delimited text file (not a real Excel workbook), which can then be opened in Excel.
If you are satisfied with your results, I recommend generating multiple CSV files via a script and then merging them, either with another script into one big Excel file or directly in Excel.
There are multiple ways to merge them in Excel; one possible way is described here.
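A minimal sketch of the scripting approach with Python and pandas (file and sheet names are placeholders; hive -e output is tab-separated by default, and writing .xlsx requires openpyxl to be installed):

import pandas as pd

# placeholder file names: one tab-separated output per Hive query
outputs = {"query1": "query1.tsv", "query2": "query2.tsv"}

with pd.ExcelWriter("report.xlsx") as writer:
    for sheet, path in outputs.items():
        # hive -e writes tab-delimited text, so read it with sep="\t"
        df = pd.read_csv(path, sep="\t")
        df.to_excel(writer, sheet_name=sheet, index=False)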

Can we specify data types for properties while importing data using neo4j-import tool?

I've imported a lot of data using the neo4j-import tool, only for it to be useless because the default data type is string for all columns. I'm not able to perform any aggregations on the data. I am able to change the data types using update commands, but this is a lot of overhead.
Is it possible to specify data types while importing the data itself using the neo4j-import tool?
You should be able to use toInt(), toFloat(), and toString(). If you need booleans, you can do a comparison to get a boolean output.
Ah, this is assuming that you have access to Cypher in the import.
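For example, if the data is going in via LOAD CSV you can cast as you create (file, label, and property names below are placeholders):

LOAD CSV WITH HEADERS FROM 'file:///people.csv' AS row
CREATE (:Person {name: row.name, age: toInt(row.age), score: toFloat(row.score)});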
If you're using the 2.2.0 (and probably 3.x?) import tool, you should be able to define the type in the header, like so: propertyName:int
See the property types part of the 2.2.0 import tool docs, and the CSV header format sections.
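As a small sketch of a typed header (file name, columns, and paths below are placeholders; adjust the command to your installation):

# people.csv: types declared directly in the header row
personId:ID,name,age:int,score:float,:LABEL
1,Alice,34,7.5,Person
2,Bob,29,6.1,Person

# import command
neo4j-import --into /path/to/graph.db --nodes people.csv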
