I'm trying to find a way of allowing a user in my Meteor app to click a button and download multiple files that they have access to (that are stored in S3 using the Slingshot package).
My first idea was just to open each file in a new browser tab, but I quickly saw that some browsers won't allow multiple new tabs to be opened and treat them as popups.
I've seen the JSZip package and I think I understand how to create a ZIP file using basic text inserts:
var zip = new JSZip();
zip.file("Hello.txt", "Hello World\n");
var img = zip.folder("images");
img.file("smile.gif", imgData, {base64: true});
var content = zip.generate({type:"blob"});
but I'm less sure how to build a ZIP file from the various S3 URLs I pass it.
Does anyone have any pointers either to how to add these remote files to the ZIP or perhaps even allow browsers to allow multiple downloads?
Many thanks
Meteor supports npm packages, and you can use the s3-zip package to download a ZIP of a set of files.
Sample use:
var s3Zip = require('s3-zip');
var fs = require('fs');
var region = 'bucket-region';
var bucket = 'name-of-s3-bucket';
var folder = 'name-of-bucket-folder/';
var file1 = 'Image A.png';
var file2 = 'Image B.png';
var file3 = 'Image C.png';
var file4 = 'Image D.png';
var output = fs.createWriteStream(__dirname + '/use-s3-zip.zip');
s3Zip
.archive({ region: region, bucket: bucket}, folder, [file1, file2, file3, file4])
.pipe(output);
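If the goal is for the user's browser to download the archive rather than writing it to disk on the server, the same stream can be piped into an HTTP response instead of a file. A rough sketch, assuming a server-side route registered through Meteor's WebApp handler (the /download-zip route name is a placeholder, and region, bucket, folder and the file variables are the ones defined above):
import { WebApp } from 'meteor/webapp';

var s3Zip = require('s3-zip');

WebApp.connectHandlers.use('/download-zip', function (req, res) {
  // tell the browser to treat the response as a ZIP download
  res.setHeader('Content-Type', 'application/zip');
  res.setHeader('Content-Disposition', 'attachment; filename="files.zip"');

  s3Zip
    .archive({ region: region, bucket: bucket }, folder, [file1, file2, file3, file4])
    .pipe(res);
});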
Link to this package: s3-zip
Additional resources:
Installing npm package in Meteor
Using npm package in Meteor
Using npm packages directly will work for Meteor 1.3 and above. For lower versions, use this package.
I'm trying to download data from my blob container.
First of all, I used the 'blob_container' function to generate a blob container object, as follows:
cont <- blob_container('https://AccountName.blob.core.windows.net/BlobContainer', key = 'AccountKey')
Next, I created a data frame to properly identify the path of each file.
list_files_blob <- list_blobs(cont, dir = "path where files are located")
Once I had collected all the information, I used the 'multidownload_blob' function to copy those files to a local path.
multidownload_blob(cont, src = list_files_blob$name, dest = 'path to copy files', overwrite = TRUE)
But I get this error.
Error: 'dest' must contain one name per file in 'src'
I know there are a lot of files to transfer, but I don't want to create a separate directory for each file; I want a single folder for all of them.
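(For context, the error means that when src is a vector of file names, multidownload_blob expects dest to be a vector with one local path per file. A minimal sketch of building such a vector while keeping everything in a single local folder; the folder name "downloads" is just a placeholder:)
# one destination path per source file, all inside the same local folder
dir.create("downloads", showWarnings = FALSE)
dest_paths <- file.path("downloads", basename(list_files_blob$name))
multidownload_blob(cont, src = list_files_blob$name, dest = dest_paths, overwrite = TRUE)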
All functions are from the AzureStor package.
My R version is 4.1.2 (2021-11-01)
"AzureStor": {
"Package": "AzureStor",
"Version": "3.7.0",
"Source": "Repository",
"Repository": "CRAN"
}
Thank you in advance.
Borja
Finally, I've found an answer.
First of all, I removed the dest parameter from the multidownload_blob function.
So all the JSON files ended up in the working directory.
Once I had all the available information, I used the following expression to create a new data frame (df) containing all the JSON files:
# requires magrittr (%>%), purrr (map_df) and jsonlite (fromJSON)
df <- fs::dir_ls(path = getwd(), regexp = "json") %>%
  map_df(fromJSON, .id = "ID", flatten = TRUE)
I hope this works for all of us.
Borja
So I am following the guide here which indicates the way to access photos is as follows:
flags <- c(
system.file("img", "flag", "au.png", package = "ggpattern"),
system.file("img", "flag", "dk.png", package = "ggpattern")
)
My goal is now to use this code for my own purposes, so I saved a few images in a folder. Here is my directory:
"C:/Users/Thom/Docs/Misc/Testy"
And within the Testy folder, there is a folder called image, holding 3 images. But the following doesn't seem to work, and I don't know why...
images <- c(
system.file("image", "image1.png", package = "ggpattern"),
system.file("image", "image2.png", package = "ggpattern")
)
system.file is for use when a file is included in a package. Basically, it looks for the file starting its search from where your R packages are installed (because this location can vary from user to user), and it returns the resolved local path to the file. Because your images are not part of an installed package, system.file("image", "image1.png", package = "ggpattern") finds nothing and returns an empty string.
If you already know the absolute path on your local computer (e.g. "C:/Users/Thom/Docs/Misc/Testy") you can use that directly as the input to a read function, e.g. readBin("C:/Users/Thom/Docs/Misc/Testy/image/image1.png")
If you want to get a little fancy, or are like me and can't ever remember which direction of slash to use on which OS, you can also do something like this, which adds in the OS-specific path separator:
readBin(file.path("C:", "Users", "Thom", "Docs", "Misc", "Testy", "image", "image1.png"))
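If the goal is to mirror the flags example with your own files, you can build the vector of image paths directly, with no system.file involved. A minimal sketch, assuming the three images live in C:/Users/Thom/Docs/Misc/Testy/image and are named image1.png, image2.png and image3.png (the third name is a guess):
img_dir <- file.path("C:", "Users", "Thom", "Docs", "Misc", "Testy", "image")

# list the files explicitly...
images <- file.path(img_dir, c("image1.png", "image2.png", "image3.png"))

# ...or pick up every PNG in the folder automatically
images <- list.files(img_dir, pattern = "\\.png$", full.names = TRUE)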
Is there a better way for programmers within a group who want to share a common style for Shiny apps or R Markdown docs to access a CSS file from a single location, rather than manually copying the desired file into the directory of each app or document?
The ideal outcome would be to place the file(s) on a github repo, then attach it to any shiny app or rmarkdown file with its web link, is this possible?
Thanks.
It might be easier to include your stylesheets in an R package. This would eliminate the need for an external request each time your app loads or your doc is opened. Place your CSS files in a folder in inst/ and write a single R function that sets the resource path to your CSS and loads the files accordingly.
Let's say your package has the following structure. (For this example, I'm naming the package mypkg)
mypkg/
  R/
    use_stylesheets.R
  inst/
    stylesheets/
      styles.min.css
      ...
  ...
In use_stylesheets.R, create a function that loads the stylesheets into the head of the document (tags$head) using tags$link.
use_stylesheets <- function() {
  shiny::addResourcePath(
    "styles",
    system.file("stylesheets", package = "mypkg")
  )
  shiny::tags$head(
    # the "styles/" prefix must match the resource path registered above
    shiny::tags$link(rel = "stylesheet", href = "styles/styles.min.css")
  )
}
Then in your app or Rmarkdown document, you can load the files into your app using: mypkg::use_stylesheets()
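For instance, a minimal sketch of a Shiny app using the package above (mypkg and the stylesheet name are the placeholders from this example):
library(shiny)

ui <- fluidPage(
  mypkg::use_stylesheets(),  # registers the resource path and injects the <link> tag
  h1("Hello, styled world")
)

shinyApp(ui, server = function(input, output, session) {})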
After some tinkering I was able to get this working (still waiting on coworkers to test on their machines), using these structures for adding CSS or HTML from a package into both Shiny apps and R Markdown documents.
# For attaching a css file to an Rmarkdown file
use_package_style_rmd <- function(css_file = "my_css.css") {
  # css file
  file_name <- paste0("/", css_file)
  file_path <- paste0(system.file("stylesheets", package = "my_pkg"), file_name)
  shiny::includeCSS(path = file_path)
}

# For placing an HTML header/footer in place
use_package_header <- function(html_file = "my_header.html") {
  # HTML header
  file_name <- paste0("/", html_file)
  file_path <- paste0(system.file("stylesheets", package = "my_pkg"), file_name)
  shiny::includeHTML(path = file_path)
}

# For attaching css to a shiny app via a resource path
use_package_style_shiny <- function(stylesheet = "my_css.css") {
  # Add resource path
  shiny::addResourcePath(
    prefix = "styles",
    directoryPath = system.file("stylesheets", package = "my_pkg")
  )
  # Link to app
  shiny::tags$head(
    shiny::tags$link(
      rel = "stylesheet",
      href = paste0("styles/", stylesheet)
    )
  )
}
The use_package_style_rmd function can be placed in any code chunk, whereas the header function will add the HTML in place where the function is run.
For the Shiny app use case, the function should be run to establish the resource path to the folder; then, in the UI under fluidPage, the theme option can be set to the CSS file using the resource path prefix, i.e. styles/my_css.css.
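A minimal sketch of that Shiny use case, reusing the hypothetical my_pkg and my_css.css from above:
library(shiny)

my_pkg::use_package_style_shiny()  # establishes the "styles" resource path

ui <- fluidPage(
  theme = "styles/my_css.css",     # CSS file addressed via the resource path prefix
  titlePanel("Styled app")
)

shinyApp(ui, server = function(input, output, session) {})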
One issue I am still having which may be a separate question, is where to place a supporting image file, and how to add the relative path such that the image can be placed in the header or footer.
I have a directory with subdirectories and many files that need to be pushed to Amazon S3. I am working in R.
Is there a clean/easy way to say "push this directory and everything in it up to S3"? I am hoping to avoid pushing things up one at a time, and manually re-building the directory structures.
If you pass file names to put_object() using their full path names, and then use those path names as the object keys, you can implicitly recreate the directory structure. Basically like this (though you may want to adjust the file names in some way when using them as object keys):
library("aws.s3")
lapply(dir(full.names = TRUE, recursive = TRUE), function(filename) {
  put_object(file = filename, object = filename, bucket = "mybucket")
})
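If you would rather have the object keys mirror the directory's internal structure than the full local path, one option is to use paths relative to that directory as the keys. A minimal sketch, with a placeholder directory and bucket name:
library("aws.s3")

local_dir <- "path/to/my/directory"  # placeholder: the directory to upload

# paths relative to local_dir become the S3 object keys
rel_paths <- list.files(local_dir, recursive = TRUE)

lapply(rel_paths, function(rel_path) {
  put_object(
    file = file.path(local_dir, rel_path),  # full local path to read from
    object = rel_path,                      # relative path used as the key
    bucket = "mybucket"
  )
})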
There is also an experimental function s3sync() that should do this for a complete file tree (but it isn't widely tested):
s3sync()
These are the steps I am trying to achieve:
Upload a PDF document on the server.
Convert the PDF document to a set of images using GhostScript (every page is converted to an image).
Send the collection of images back to the client.
So far, I am interested in #2.
First, I downloaded both gswin32c.exe and gsdll32.dll and managed to manually convert a PDF to a collection of images (I opened cmd and ran the command below):
gswin32c.exe -dSAFER -dBATCH -dNOPAUSE -sDEVICE=jpeg -r150 -dTextAlphaBits=4 -dGraphicsAlphaBits=4 -dMaxStripSize=8192 -sOutputFile=image_%d.jpg somepdf.pdf
Then I thought, I'll put gswin32c.exe and gsdll32.dll into ClientBin of my web project, and run the .exe via a Process.
System.Diagnostics.Process process1 = new System.Diagnostics.Process();
process1.StartInfo.WorkingDirectory = Request.MapPath("~/");
process1.StartInfo.FileName = Request.MapPath("ClientBin/gswin32c.exe");
process1.StartInfo.Arguments = "-dSAFER -dBATCH -dNOPAUSE -sDEVICE=jpeg -r150 -dTextAlphaBits=4 -dGraphicsAlphaBits=4 -dMaxStripSize=8192 -sOutputFile=image_%d.jpg somepdf.pdf";
process1.Start();
Unfortunately, nothing was output in ClientBin. Anyone got an idea why? Any recommendation will be highly appreciated.
I've tried your code and it seems to be working fine. I would recommend checking the following things:
Verify that your somepdf.pdf is in the working folder of the gs process, or specify the full path to the file on the command line. It would also be useful to see Ghostscript's output by doing something like this:
....
process1.StartInfo.RedirectStandardOutput = true;
process1.StartInfo.UseShellExecute = false;
process1.Start();
// read output
string output = process1.StandardOutput.ReadToEnd();
...
process1.WaitForExit();
...
If gs can't find your file, you will get an "Error: /undefinedfilename in (somepdf.pdf)" in the output stream.
Another possible cause is that your script proceeds without waiting for the gs process to finish generating the resulting image_N.jpg files. Adding process1.WaitForExit() should solve that.
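Putting those suggestions together, a rough sketch (specifying the PDF with Request.MapPath is an assumption about where the file actually lives):
var process1 = new System.Diagnostics.Process();
process1.StartInfo.WorkingDirectory = Request.MapPath("~/");
process1.StartInfo.FileName = Request.MapPath("ClientBin/gswin32c.exe");
process1.StartInfo.Arguments =
    "-dSAFER -dBATCH -dNOPAUSE -sDEVICE=jpeg -r150 " +
    "-dTextAlphaBits=4 -dGraphicsAlphaBits=4 -dMaxStripSize=8192 " +
    "-sOutputFile=image_%d.jpg " + Request.MapPath("~/somepdf.pdf");
process1.StartInfo.RedirectStandardOutput = true;
process1.StartInfo.UseShellExecute = false;

process1.Start();

// Read Ghostscript's messages (e.g. /undefinedfilename errors) for diagnostics
string output = process1.StandardOutput.ReadToEnd();

// Block until Ghostscript has finished writing the image_N.jpg files
process1.WaitForExit();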