How can I create and download a zip file using ActionScript? - apache-flex

I'm developing an application using the Adobe Flex 4.5 SDK in which the user should be able to export multiple files bundled in one zip file. I think I need to take the following steps to perform this task:
Create a temporary folder on the server for the user who requested the download. Since it is an anonymous type of user, I have to read State/Session information to identify the user.
Copy all the requested files into the temporary folder on the server.
Zip the copied files.
Download the zip file from the server to the client machine.
I was wondering if anybody knows of any best practices or sample code for this task.
Thanks

The ByteArray class has some methods for compressing, but these are meant for data transport, not for packaging up multiple files.
I don't like saying things are impossible, but I will say that this should be done on the server side. Depending on your server architecture, I would suggest sending the binary files to a server script that can package the files for you.
A quick Google search for your preferred server-side language plus "zip files" should give you some sample scripts to get you started.
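As a minimal sketch of what such a server-side script does, here is the bundling step using Java's built-in java.util.zip. The file names and temp-folder handling are placeholders for illustration, not code from the question:

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipFiles {
    // Bundle the given files into a single zip archive on disk.
    static void zip(List<Path> files, Path zipPath) throws IOException {
        try (ZipOutputStream out = new ZipOutputStream(Files.newOutputStream(zipPath))) {
            for (Path file : files) {
                out.putNextEntry(new ZipEntry(file.getFileName().toString()));
                Files.copy(file, out);   // stream the file's bytes into the entry
                out.closeEntry();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the per-user temporary folder described in the question.
        Path dir = Files.createTempDirectory("export");
        Path a = Files.writeString(dir.resolve("a.txt"), "hello");
        Path b = Files.writeString(dir.resolve("b.txt"), "world");
        Path zip = dir.resolve("bundle.zip");
        zip(List.of(a, b), zip);
        System.out.println(Files.size(zip) > 0);   // prints: true
    }
}
```

The Flex client would then simply request the resulting zip's URL to trigger the download.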

Related

How to access remote file in GGIR package in R?

I am using the GGIR package for accelerometer data analysis. My data is in a OneDrive folder, which takes a long time to download. Is there a way I can access the OneDrive files directly without downloading them to my local machine?
My guess would be that this is not possible. If you're working with Azure, there are tools available to connect to OneDrive and download/upload the data, which is then processed on a separate instance. I'm guessing the same applies to your local machine, but I'm not familiar enough with Microsoft's services to be sure.
For example:
By using Azure Logic Apps and the OneDrive connector, you can create automated tasks and workflows to manage your files, including upload, get, delete files, and more. With OneDrive, you can perform these tasks:
Build your workflow by storing files in OneDrive, or update existing files in OneDrive.
Use triggers to start your workflow when a file is created or updated within your OneDrive.
Use actions to create a file, delete a file, and more. For example, when a new Office 365 email is received with an attachment (a trigger), create a new file in OneDrive (an action).
https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-onedrive

HttpClient.PostAsync for uploading files

I am working on an API that needs to download a file from server A and upload it to server B on the same network. It's for internal use. Each of the files will have multiple versions and will need to be uploaded to server B multiple times, and all versions of the same file will share the same file name. This is my first time dealing with file manipulation, so please bear with me if my question sounds ignorant. Can I use HttpClient.PostAsync for the uploading part? Or can I just use Stream.CopyToAsync, if it's OK to simply copy the file over? Thanks!
Stream.CopyToAsync copies one stream to another within the memory of the same server.
In your case you can use HttpClient.PostAsync, but on the other server there must be some API that receives the stream content and saves it to disk.
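The pattern is the same in any HTTP stack; here is a hedged sketch in Java rather than .NET, with a hypothetical /upload endpoint on server B that saves the request body to disk, and a client that streams the file without buffering it fully in memory. The endpoint path and file names are assumptions for illustration:

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.*;
import java.nio.file.*;

public class StreamUpload {
    public static void main(String[] args) throws Exception {
        // --- Server B: a hypothetical endpoint that receives the bytes and saves them ---
        Path saved = Files.createTempDirectory("uploads").resolve("report-v1.bin");
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/upload", exchange -> {
            // Versions share a file name, so REPLACE_EXISTING overwrites the old one.
            Files.copy(exchange.getRequestBody(), saved, StandardCopyOption.REPLACE_EXISTING);
            exchange.sendResponseHeaders(200, -1);
            exchange.close();
        });
        server.start();

        // --- Client: download from server A (a local file stands in here) and POST to B ---
        Path source = Files.writeString(Files.createTempFile("src", ".bin"), "file contents v1");
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:" + server.getAddress().getPort() + "/upload"))
                .POST(HttpRequest.BodyPublishers.ofInputStream(() -> {
                    try { return Files.newInputStream(source); }
                    catch (Exception e) { throw new RuntimeException(e); }
                }))
                .build();
        HttpResponse<Void> response = client.send(request, HttpResponse.BodyHandlers.discarding());

        System.out.println(response.statusCode());     // prints: 200
        System.out.println(Files.readString(saved));   // prints: file contents v1
        server.stop(0);
    }
}
```

The key point carries over to HttpClient.PostAsync: the client streams the body, but server B still needs its own endpoint that persists it.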

Generating ZIP files in azure blob storage

What is the best method to zip large files present in Azure blob storage and return them to the user as an archive file (zip/rar)?
Can Azure Batch help here?
Currently we implement this function in a traditional way: we read the streams, generate the zip file, and return the result, but this takes a lot of resources on the server and a lot of time for users.
I'm asking about the best technical solution (preferably using Microsoft technologies).
There are a few ways you can do this **from an azure-batch-only point of view**. (For the initial zipping itself, your code should own whatever zip API it uses; but once the archive is in blob storage and you want to use it on the nodes, there are the options below.)
For the initial part of your question I found this, which could come in handy: https://microsoft.github.io/AzureTipsAndTricks/blog/tip141.html (this is mainly for the idea; you know your requirements better and will need to design your solution accordingly).
In options 1 and 3 below, your own code must handle unpacking the zip file. Option 2 is the Batch built-in feature for *.zip files, at both the pool and task level.
Option 1: You could add your *.rar or *.zip file as an Azure Batch resource file and then unzip it in the start task, once the resource file is downloaded. See: Azure Batch Pool Start up task to download resource file from Blob FileShare
Option 2: The best option, if you have a zip file (rather than rar) in play, is the Azure Batch application packages feature: https://learn.microsoft.com/en-us/azure/batch/batch-application-packages
The application packages feature of Azure Batch provides easy management of task applications and their deployment to the compute nodes in your pool. With application packages, you can upload and manage multiple versions of the applications your tasks run, including their supporting files. You can then automatically deploy one or more of these applications to the compute nodes in your pool.
https://learn.microsoft.com/en-us/azure/batch/batch-application-packages#application-packages
An application package is a .zip file that contains the application binaries and supporting files that are required for your tasks to run the application. Each application package represents a specific version of the application.
Regarding size limits: refer to the maximum allowed blob size in the document linked above.
Option 3: (Not sure whether this fits your scenario.) A long shot, but you could also mount a blob container as a virtual drive via the mount feature in Azure Batch when nodes join the pool, and write code in the start task (or similar) to unzip from the mounted location.
Hope this helps :)
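On the "takes many resources on the server" point: whichever storage SDK sits in front, the archive can be produced by streaming each source into the zip through a fixed-size buffer instead of materializing whole files in memory. A minimal sketch in Java's java.util.zip — in a real deployment the blob SDK would supply the input and output streams, while in-memory streams stand in here:

```java
import java.io.*;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class StreamingZip {
    // Copy each named source stream into the zip without buffering whole files in memory.
    static void zipStreams(Map<String, InputStream> sources, OutputStream destination) throws IOException {
        try (ZipOutputStream zip = new ZipOutputStream(destination)) {
            byte[] buffer = new byte[64 * 1024];
            for (Map.Entry<String, InputStream> source : sources.entrySet()) {
                zip.putNextEntry(new ZipEntry(source.getKey()));
                int read;
                while ((read = source.getValue().read(buffer)) != -1) {
                    zip.write(buffer, 0, read);   // fixed-size buffer: constant memory per file
                }
                zip.closeEntry();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        zipStreams(Map.of("a.txt", new ByteArrayInputStream("hello".getBytes())), out);
        System.out.println(out.size() > 0);   // prints: true
    }
}
```

If the destination stream is the HTTP response itself, the user starts receiving the archive immediately instead of waiting for it to be fully built server-side.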

How to Read a .csv file from url

I have successfully developed a Shiny application for sentiment analysis. When I deployed the application, it did not work. After going through the code I found that it refers to files (emotions.csv.gz, subjectivity.csv.gz) which are located on my system. That could be why the application is not working: on the server it is not able to find these particular files. Is there any method to read these files directly from the URL where they are hosted, so that there are no local dependencies?
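In R, read.csv() accepts a URL directly, so no local copy is needed. As a language-agnostic illustration of the same idea, here is a sketch in Java that parses CSV rows from any input stream, so a URL stream can be plugged in just as easily as a file; the URL and column names in the comments are made up:

```java
import java.io.*;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.GZIPInputStream;

public class RemoteCsv {
    // Parse simple comma-separated rows from any stream: a local file, or
    // new URL("https://example.com/emotions.csv.gz").openStream() (hypothetical URL).
    static List<String[]> readCsv(InputStream in) throws IOException {
        List<String[]> rows = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                rows.add(line.split(","));
            }
        }
        return rows;
    }

    public static void main(String[] args) throws IOException {
        // For a .csv.gz file, wrap the stream in GZIPInputStream first:
        //   readCsv(new GZIPInputStream(new URL("...").openStream()));
        List<String[]> rows = readCsv(new ByteArrayInputStream("word,emotion\nhappy,joy".getBytes()));
        System.out.println(rows.size());      // prints: 2
        System.out.println(rows.get(1)[1]);   // prints: joy
    }
}
```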

SFTP polling using java

My scenario as follows:
One Java program is uploading some random files to an SFTP location.
My requirement is that as soon as a file is uploaded by that program, I need to download it using Java. The files can be around 100 MB in size. I am searching for a Java API that is helpful here. I don't even know the names of the files, but I can match them with a regular expression. The same file can be uploaded by the other program periodically. Since the files are large, I need to wait until a file is completely uploaded before downloading it.
I have used JSch to download files, but I don't see how to poll using JSch.
Polling
All you can do is keep listing the remote directory periodically until you find a new file. There's no better way with SFTP. For that, you would use ChannelSftp.ls().
Regarding selecting files matching certain pattern, see:
JSch ChannelSftp.ls - pass match patterns in java
Waiting until the upload is complete
Again, there's no support for this in widespread implementations of SFTP.
For details, see my answer at:
SFTP file lock mechanism.
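A sketch of the polling loop described above. With JSch, the listing would come from ChannelSftp.ls(path), reading each LsEntry's filename and its SftpATTRS.getSize(); here a pluggable supplier of name-to-size entries stands in so the logic is self-contained. The "upload finished" check is the common size-stability heuristic (size unchanged between two consecutive listings), which, as the answer notes, is only a heuristic since SFTP has no real support for this:

```java
import java.util.*;
import java.util.function.Supplier;
import java.util.regex.Pattern;

public class SftpPoller {
    // One polling pass: returns the names of files that match the pattern,
    // haven't been downloaded yet, and whose size is unchanged since last pass.
    static List<String> pollOnce(Supplier<Map<String, Long>> lister,
                                 Pattern namePattern,
                                 Map<String, Long> lastSizes,
                                 Set<String> alreadyDownloaded) {
        List<String> ready = new ArrayList<>();
        Map<String, Long> current = lister.get();
        for (Map.Entry<String, Long> file : current.entrySet()) {
            String name = file.getKey();
            if (!namePattern.matcher(name).matches() || alreadyDownloaded.contains(name)) continue;
            // Heuristic for "upload finished": size unchanged since the previous poll.
            if (file.getValue().equals(lastSizes.get(name))) {
                ready.add(name);
                alreadyDownloaded.add(name);
            }
        }
        lastSizes.clear();
        lastSizes.putAll(current);
        return ready;
    }
}
```

You would call pollOnce in a loop with a Thread.sleep between iterations and fetch each returned name with ChannelSftp.get(). Since the question says the same file can be re-uploaded periodically, the alreadyDownloaded set would need to be reset, or replaced with a modification-time check, to pick up new versions.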
