Transform and Send CSV file to WSO2 DSS service - wso2-data-services-server

I have a CSV file. I want to send the data in the CSV to a database using a DSS service.
How can I use the VFS transport for that? Or do I need to use another mediator?

The easiest way would be to have a Data Service that inserts a record into your table.
Then have a proxy that deals with the file: the proxy can use VFS to read the file, then a Smooks mediator to transform the CSV to XML.
From there the proxy can iterate over the records from the CSV file and send them to the data service, as sketched below.
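A minimal sketch of such a proxy, assuming WSO2 ESB with the VFS transport enabled, might look like the following. The file locations, the Smooks configuration registry key, the record XPath and the data service endpoint are all placeholders to replace with your own:

    <proxy xmlns="http://ws.apache.org/ns/synapse" name="CsvFileProxy" transports="vfs">
        <!-- Poll a directory for CSV files (paths are placeholders) -->
        <parameter name="transport.vfs.FileURI">file:///data/csv/in</parameter>
        <parameter name="transport.vfs.FileNamePattern">.*\.csv</parameter>
        <parameter name="transport.vfs.ContentType">text/plain</parameter>
        <parameter name="transport.PollInterval">15</parameter>
        <parameter name="transport.vfs.ActionAfterProcess">MOVE</parameter>
        <parameter name="transport.vfs.MoveAfterProcess">file:///data/csv/done</parameter>
        <target>
            <inSequence>
                <!-- Smooks config (stored in the registry) turns the CSV payload into XML -->
                <smooks config-key="conf:/smooks/csv-to-xml.xml">
                    <input type="text"/>
                    <output type="xml"/>
                </smooks>
                <!-- One call to the data service per record produced by Smooks -->
                <iterate expression="//csv-records/csv-record">
                    <target>
                        <sequence>
                            <send>
                                <endpoint>
                                    <address uri="http://localhost:9763/services/CsvInsertDataService"/>
                                </endpoint>
                            </send>
                        </sequence>
                    </target>
                </iterate>
            </inSequence>
            <outSequence>
                <drop/>
            </outSequence>
        </target>
    </proxy>

In practice you would usually add a PayloadFactory or XSLT mediator inside the iterate target so that each record becomes exactly the request message expected by the data service's insert operation.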

Related

Consuming Client-Side Encrypted Data from Snowflake

I am trying to ingest client-side encrypted data files from S3 into Snowflake, and I want to query the data in Snowflake in readable form using Snowflake SQL.
I have encrypted the data file using AES-256 and placed it in S3. I also followed the prerequisites of setting up my external stage with a MASTER_KEY (AES-256, base64 encoded). However, when I read the data, it is not shown in readable form.
I would like to know whether client-side encrypted data can be read in the clear in Snowflake with the right authentication and authorization (without having to unload it back to S3).
Thanks in advance.
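For reference, the stage setup described above would look roughly like this; the bucket, credentials, master key and file format name are all placeholders:

    -- Client-side-encryption stage as described above; all values are placeholders.
    CREATE OR REPLACE STAGE my_cse_stage
      URL = 's3://my-bucket/encrypted/'
      CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
      ENCRYPTION = (TYPE = 'AWS_CSE' MASTER_KEY = '<base64-encoded-256-bit-key>');

    -- Reading then goes through the stage as usual, e.g.:
    SELECT $1, $2
    FROM @my_cse_stage (FILE_FORMAT => 'my_csv_format');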

Transferring encrypted data using Sqoop

Is this use case possible:
To first extract data, encrypt it, transfer it over the network, decrypt it and load it into Hive or HDFS using Sqoop?
You can achieve this by following the steps below (a command-line sketch follows):
Use the sqoop codegen tool to generate the mapper code that handles deserialization of the table data.
Modify this code to encrypt the data read from the table. Each instance represents one row.
Now run the sqoop import command, which will use this modified mapper code to produce encrypted data. This is what gets transmitted to HDFS.
Use decryption logic over the output files in HDFS to get back the content.
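As a rough sketch of those steps, with the connection string, table name and paths as placeholders:

    # 1. Generate the ORM/mapper code for the table.
    sqoop codegen \
      --connect jdbc:mysql://dbhost/salesdb \
      --username etl --password-file /user/etl/.dbpassword \
      --table customers \
      --outdir ./generated-src --bindir ./generated-bin

    # 2. Edit ./generated-src/customers.java to encrypt the fields as they are
    #    read from the table, then recompile and re-jar the class.

    # 3. Import using the modified class instead of letting Sqoop regenerate it.
    sqoop import \
      --connect jdbc:mysql://dbhost/salesdb \
      --username etl --password-file /user/etl/.dbpassword \
      --table customers \
      --class-name customers \
      --jar-file ./generated-bin/customers.jar \
      --target-dir /data/encrypted/customers

    # 4. Run your decryption logic over /data/encrypted/customers before
    #    loading the clear data into Hive.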

Meteor package for upload on amazon s3

Which package should I use for file uploads to Amazon S3: knox, collectionFS, or another one?
Is collectionFS ready for file uploads to S3? Or is knox good enough?
Note: in any case I don't want to share my key on the client side, because of the security issue.
Also, is there an option where the client file stream can be connected directly to a stream that uploads to S3, so that the file is never actually present on the server?
I happen to be looking at this too. CollectionFS allows files to be sent to the server, and you can create a handler that sends them to S3 (possibly as a blob stream rather than storing them): https://github.com/CollectionFS/Meteor-cfs-s3
In the end, you need a place to specify the access key for uploading. If you do not put it on the client side, and you want the file stream to upload directly to S3, how does the stream get access?
Possibly you could also take a look at this as an alternative: https://www.inkfilepicker.com/
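A rough sketch of the CollectionFS route (assuming the cfs:standard-packages and cfs:s3 packages; the collection name, template name, bucket and settings keys are placeholders) keeps the keys in server-side settings, so the client only streams the file to the server:

    // lib/files.js -- loaded on client and server
    var s3Store = new FS.Store.S3("s3files", {
      // These options are only read on the server; keep the keys in
      // Meteor.settings (server side), never in client-shipped code.
      accessKeyId: Meteor.settings.AWSAccessKeyId,
      secretAccessKey: Meteor.settings.AWSSecretAccessKey,
      bucket: "my-upload-bucket"
    });

    Files = new FS.Collection("files", {
      stores: [s3Store]
    });

    // client/upload.js -- the browser streams the file to the Meteor server,
    // which pushes it on to S3, so the keys never reach the client.
    Template.upload.events({
      "change input[type=file]": function (event, template) {
        FS.Utility.eachFile(event, function (file) {
          Files.insert(file, function (err, fileObj) {
            if (err) console.error("Upload failed", err);
          });
        });
      }
    });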

Text file output from BizTalk?

I have a map in BizTalk from an external XML schema to an internal XML schema. I wish to map the internal schema to a flat file schema, but that doesn't seem to be possible: I can't select the flat file schema as the destination schema. So I guess I need to use a flat file assembler in a send pipeline? But in the properties of the flat file assembler, the flat file schema is not visible under Document Schema.
Do you know how to do this?
I am using BizTalk 2009.
+1 to what Jay said.
Moreover, you should do the following after step 2:
2.1 Create a new map that references the source schema (the external schema) and the destination schema (the flat file schema created in step 2),
and then use this map to transform the XML to a flat file. This should be done using a send port configured with two things:
the pipeline (already mentioned in Jay's response, step 3)
Outbound Maps: you can find this in the BizTalk Administration Console under the send port properties. Add the map you created in step 2.1 to the outbound maps.
Typically, you will be receiving the XML file on a receive location, so what you need to do is create a new receive port and a receive location, and then in the send port properties (again!) add a new filter with the following configuration:
BTS.ReceivePortName == XXXXX
Where XXXXX is the receive port name.
This way, any message received on receive port XXXXX will be sent to the send port with the map and pipeline already configured.
Create an example flat file that has the formatting you want for your output.
Create a flat file schema using the flat file schema wizard. Use your example file as the input to the wizard.
Create a send pipeline and put the flat file assembler into the pipeline.
Click on the flat file assembler shape and set the document schema to the flat file schema you created. This tells it the format of the output file.
Create a send port and use the pipeline you created.
Send your data to the send port.

Using the WCF-SQL adapter

I need to poll the data in XML format and map it to the EDI 834.
I have written the stored procedure using FOR XML AUTO, ELEMENTS,
and when I consume it using Add Adapter Metadata I get an XML message,
but I need to use this XML message to map it to the EDI 834. How do I get the structure of the XML so that I can use it in a map?
I also followed this thread: http://social.msdn.microsoft.com/Forums/en-US/biztalkgeneral/thread/6a7e0093-0692-4ba5-9e14-0d2090c2cf54
I generated the schemas using XML polling and mapped them to the EDI 834,
but when I use the map as an outbound map, it doesn't map the polling data to the EDI 834.
The WCF-SQL adapter removes the need to use the 'FOR XML AUTO, ELEMENTS' syntax; that is a legacy leftover from the old SQL adapter.
Just write your stored procedure so that it returns a consistent result set, then generate metadata against the stored procedure. The adapter framework will create an appropriate schema based on the metadata returned from your stored procedure.
Then simply map the data from your WCF-SQL schema to your EDI 834 schema.
-Create the stored procedure that returns XML (or an XML part) using the FOR XML PATH syntax (a sketch is shown after these steps).
-Set up a receive location using WCF-SQL. Select XmlPolling. Choose a root name and namespace for the adapter to wrap around the XML returned from SQL (mandatory).
-Set the Polling Statement to: exec [SPNAME]
-Set the PollDataAvailableStatement to something appropriate that will return a count > 0 if there are rows/XML to be polled.
-Use the PassThruReceive pipeline for the receive location.
-Set up a send port (FILE) that subscribes to everything that comes from the receive port used for the receive location.
-Start the application and examine the XML returned by the adapter.
-In Visual Studio, generate a schema using well-formed XML (Add -> Add Generated Items -> Generate Schemas). (NOTE: you may have to run InstallWFX.vbs, found under BizTalk SDK/Utilities/Schema Generator, if you have not already done so on the machine.)
-Choose the XML file generated by the adapter (give the file a name representing the schema you are trying to create).
-Now you should have a schema representing the XML returned by the adapter; you may have to go through the schema manually and change data types to something more appropriate than what the wizard has chosen.
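A sketch of the kind of FOR XML PATH procedure meant in the first step; the table, columns and the "processed" flag are assumptions for illustration only:

    -- Hypothetical polling procedure; dbo.Enrollment and its columns are placeholders.
    CREATE PROCEDURE dbo.GetPendingEnrollments
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT  MemberId,
                FirstName,
                LastName,
                CoverageStart
        FROM    dbo.Enrollment
        WHERE   Processed = 0
        FOR XML PATH('Enrollment'), ROOT('Enrollments');

        -- Mark the rows as handled so the next poll does not pick them up again.
        UPDATE dbo.Enrollment SET Processed = 1 WHERE Processed = 0;
    END

    -- A matching PollDataAvailableStatement for the receive location could be:
    -- SELECT COUNT(*) FROM dbo.Enrollment WHERE Processed = 0

The adapter then wraps the returned XML in the root name and namespace configured on the receive location, which is what you generate the schema from.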
