How to upload multiple files to a folder in XQuery?

How can I upload multiple files to a folder in XQuery?

Are you trying to iterate through a directory on the filesystem and insert those files at some location in the db, like this?
for $e in xdmp:filesystem-directory('c:\my-files\')/dir:entry
let $file := xdmp:filesystem-file($e/dir:pathname)
let $dest-uri := fn:concat('/dest-path/', $e/dir:filename)
where $e/dir:type eq 'file'
return xdmp:document-insert($dest-uri, $file)

You should take a look at Information Studio. It provides a UI for loading and transforming content. One of the built-in “collectors” allows you to point at a file system directory and recursively load its contents. Information Studio automatically handles the directory traversal and breaking the load up into multiple transactions, among many other conveniences.

Related

Can I use PowerBI to access SharePoint files, and R to write those files to a local directory (without opening them)?

I have a couple of large .xlsb files in 2FA-protected SharePoint. They refresh periodically, and I'd like to automate the process of pulling them across to a local directory. I can do this in PowerBI already by polling the folder list, filtering to the folder/files that I want, importing them and using an R script to write that to an .rds (it doesn't need to be .rds - any compressed format would do). Here's the code:
let
#"~ Query ~"="",
//Address for the SP folder
SPAddress="https://....sharepoint.com/sites/...",
//Poll the content
Source15 = SharePoint.Files(SPAddress, [ApiVersion=15]),
//... some code to filter the content list down to the 2 .xlsb files I'm interested in - they're listed as nested 'binary' items under column 'Content' within table 'xlsbList'
//R export within an arbitrary 'add column' instruction
ExportRDS = Table.AddColumn(xlsbList, "Export", each R.Execute(
"saveRDS(dataset, file = ""C:/Users/current.user/Desktop/XLSBs/" & [Label] & ".rds"")",[dataset=Excel.Workbook([Content])[Data]{0}]))
However, the files are so large that my login times out before the refresh can complete. I've tried using R's file.copy command instead of saveRDS, to pick up the files as binaries (so PowerBI never has to import them):
R.Execute("file.copy(dataset, ""C:/Users/current.user/Desktop/XLSBs/""),[dataset=[Content]])
with dataset=[Content] instead of dataset=Excel.Workbook([Content])[Data]{0} (which gives me a different error, but in any event would result in the same runtime issues as before), but it tells me "The Parameter 'dataset' isn't a Table". Is there a way to reference what PowerBI sees as binary objects from within nested R (or Python) code, so that I can copy them to a local directory without PowerBI importing them as data?
Unfortunately I don't have permissions to set the SharePoint site up for direct access from R/Python, or I'd leave PowerBI out entirely.
Thanks in advance for your help

Grunt-init copyAndProcess function: Can I pass in multiple values to 'noProcess' option?

I'm using grunt-init to build a template for a site structure I repeat regularly.
The template.js file uses the init.copyAndProcess function to customize most of the files, but a few of them (some fonts and image files) get corrupted by the file processing, and I want to include those files in the 'noProcess' option. If these files all existed in the same directory, I could use the noProcess option as described in the documentation (see http://gruntjs.com/project-scaffolding#copying-files) and pass in a single string, and it works:
var files = init.filesToCopy(props);
init.copyAndProcess(files, props, {noProcess: 'app/fonts/**'} );
Unfortunately, the files that need to be excluded from processing are not all in the same directory, and I'd like to be able to pass in an array of patterns, something like the following block of code, but this does not work.
var files = init.filesToCopy(props);
init.copyAndProcess(files, props, {noProcess: ['app/fonts/**', 'app/images/*.png', 'app/images/*.jpg']} );
Any thoughts on how I can have multiple targets for the 'noProcess' option?
As soon as I posted the question, I realized that my proposed code did work. I simply had an invalid path left over from renaming my 'app' directory to 'dev'.

Flex: Create a new folder in a ZIP file

I am using Flex, Flash Builder 4.5 and Extension Builder 2.0.0, and I use the "nochump ziplib" library to generate a ZIP file. I want to create a new folder in the created ZIP file, but I can't find such a function in the "nochump" library.
Can anyone please tell me whether there is a function to add a new folder to a ZIP file, or a library that can help me do this?
Directories are not first-class citizens in the ZIP format.
The archive is built from "entries": plain files stored with paths relative to the "central directory" (the "root" of the archive). This means the ZIP file is composed of entries like "pictures/1.jpg", "doc/old/1.txt", etc. There are no separate entries for the "pictures", "doc" or "doc/old" directories.
So you can't create a new directory directly. Instead of creating the directory first (such as "newDir"), create a file entry inside it (such as "newDir/1.txt"); "newDir" will then appear as a directory when you open the resulting ZIP file.
If you insist on having an empty directory in the archive, you may try the hacky way: adding a zero-length entry like "newDir/.". But this may not work with your library.
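To make the idea concrete, here is a minimal sketch of the same concept in C++ using libzip. This is an assumption-laden illustration, not the nochump ActionScript library: the archive and entry names are invented, and the point is only that "folders" exist as path prefixes on entries (plus, for the empty-directory case, as a zero-length directory entry).

// Sketch using libzip (C/C++), NOT the nochump ActionScript library.
// It only illustrates that "folders" are nothing but path prefixes on entries.
#include <zip.h>
#include <cstring>

int main()
{
    int err = 0;
    zip_t *archive = zip_open("example.zip", ZIP_CREATE | ZIP_TRUNCATE, &err);
    if (archive == nullptr)
        return 1;

    // Adding an entry named "pictures/readme.txt" is enough for "pictures"
    // to show up as a folder; no separate directory entry is needed.
    static const char body[] = "hello";
    zip_source_t *src = zip_source_buffer(archive, body, strlen(body), 0);
    if (src != nullptr)
        zip_file_add(archive, "pictures/readme.txt", src, ZIP_FL_ENC_UTF_8);

    // The "empty directory" case: a zero-length entry treated as a directory
    // (libzip exposes this directly).
    zip_dir_add(archive, "newDir", ZIP_FL_ENC_UTF_8);

    zip_close(archive);   // the entries are actually written out here
    return 0;
}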
The Wikipedia article for the ZIP format has all the theory explained pretty well.

System::IO::Directory::GetFiles in C++

I am having trouble storing the file names from a directory into a string array in C++, using System::IO::Directory::GetFiles.
I would also like to know whether I can copy an entire folder to a new destination in C++, like the C# example at http://www.codeproject.com/KB/files/xdirectorycopy.aspx.
You can store the file names from a directory in a managed array like this:
System::String ^path = "c:\\";
cli::array<System::String ^>^ a = System::IO::Directory::GetFiles(path);
Console::WriteLine(a[0]);
Console::ReadKey();
As for how you would copy an entire folder: simply recurse from a given root directory, creating each directory and copying the files to the new location. If you are asking for code for this, then please say so, but at least try to figure it out for yourself first (i.e. show me what you have so far).
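A minimal C++/CLI sketch of that recursion might look like the following; the helper name CopyDirectory and the example paths are made up, but the framework calls (Directory::CreateDirectory, Directory::GetFiles, Directory::GetDirectories, File::Copy, Path::Combine) are the real ones.

// Sketch (C++/CLI): recursively copy a directory tree with System::IO.
using namespace System;
using namespace System::IO;

void CopyDirectory(String^ source, String^ destination)
{
    // Make sure the target directory exists (no-op if it already does).
    Directory::CreateDirectory(destination);

    // Copy every file at this level, overwriting existing files.
    for each (String^ file in Directory::GetFiles(source))
        File::Copy(file, Path::Combine(destination, Path::GetFileName(file)), true);

    // Recurse into each subdirectory.
    for each (String^ dir in Directory::GetDirectories(source))
        CopyDirectory(dir, Path::Combine(destination, Path::GetFileName(dir)));
}

int main(array<String^>^ args)
{
    CopyDirectory("c:\\source-folder", "c:\\dest-folder");
    return 0;
}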
Check out the file listing program in Boost.Filesystem: http://www.boost.org/doc/libs/1_41_0/libs/filesystem/example/simple_ls.cpp. It iterates over all files, printing the paths, but it's trivial to store them instead.
Assuming you're on Win32, you're looking for the FindFirstFile and FindNextFile APIs.
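For reference, a bare-bones native sketch along those lines (the directory path is just an example):

// Sketch (Win32): list the files in a directory with FindFirstFile/FindNextFile.
#include <windows.h>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::vector<std::string> files;
    WIN32_FIND_DATAA findData;

    // The trailing \* matters: the API matches a pattern, not a directory name.
    HANDLE handle = FindFirstFileA("c:\\some-folder\\*", &findData);
    if (handle != INVALID_HANDLE_VALUE)
    {
        do
        {
            // Skip subdirectories, including the "." and ".." entries.
            if (!(findData.dwFileAttributes & FILE_ATTRIBUTE_DIRECTORY))
                files.push_back(findData.cFileName);
        } while (FindNextFileA(handle, &findData));
        FindClose(handle);
    }

    for (const std::string& name : files)
        std::cout << name << '\n';
    return 0;
}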
C/C++ does not define a standard way to do this, though Boost.Filesystem provides one if you need cross-platform support.
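A minimal sketch of that portable approach, assuming Boost.Filesystem is available and linked (std::filesystem in C++17 is nearly identical); the directory path is an arbitrary example:

// Sketch: portable file listing with Boost.Filesystem.
#include <boost/filesystem.hpp>
#include <iostream>
#include <string>
#include <vector>

namespace fs = boost::filesystem;

std::vector<std::string> list_files(const fs::path& dir)
{
    std::vector<std::string> files;
    // A default-constructed directory_iterator acts as the end marker.
    for (fs::directory_iterator it(dir), end; it != end; ++it)
        if (fs::is_regular_file(it->status()))
            files.push_back(it->path().string());
    return files;
}

int main()
{
    for (const std::string& name : list_files("c:\\some-folder"))
        std::cout << name << '\n';
    return 0;
}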

How can I associate many existing files with Drupal FileField?

I have many mp3 files stored on my server already from a static website, and we're now moving to drupal. I'm going to create a node for each audio file, but I don't want to have to upload each file again. I'd rather copy the files into the drupal files directory where I want them, and then associate the nodes with the appropriate file.
Any ideas on how to accomplish that?
Thanks!
I am not sure whether I am proposing a different approach or just restating what you already meant in your original question, but since you want the nodes to be the files, I would rather generate the nodes starting from the files than associate existing nodes with existing files.
In generic terms I would do it programmatically: for each existing file in your import directory, build the $node object and then invoke node_save($node) to store it in Drupal.
Of course, in building the $node object you will need to invoke the API functions of the module you are using to manage the files. Here's some sample code I wrote for a similar task. In this scenario I was attaching a product sheet to a product (a node with additional fields), so...
field_sheet was the CCK field for the product sheet in the product node
product was the node type
$sheet_file was the complete reference (path + filename) to the product sheet file.
So the example:
// Load the CCK field
$field = content_fields('field_sheet', 'product');
// Load the appropriate validators
$validators = array_merge(filefield_widget_upload_validators($field));
// Where do we store the files?
$files_path = filefield_widget_file_path($field);
// Create the file object
$file = field_file_save_file($sheet_file, $validators, $files_path);
// Apply the file to the field, this sets the first file only, could be looped
// if there were more files
$node->field_sheet = array(0 => $file);
// The file has been copied in the appropriate directory, so it can be
// removed from the import directory
unlink($sheet_file);
BTW: if you use a library to read MP3 metadata, you could set $node->title and other attributes in a sensible way.
Hope this helps!
The file_import module doesn't do exactly what you want (it creates node attachments instead of nodes), but it would be relatively simple to use that module as guidance along with the Drupal API to do what you want.
