I have a requirement to zip multiple folders inside a parent folder and display the file in the App Engine output. The folder structure on the Unix file server is:
Parent Folder
- Folder1 (contains files)
- Folder2 (contains files)
How can I zip the folders and store the archive in the parent folder using PeopleCode in an App Engine program? The final folder structure should be as follows:
Parent Folder
- Folder1
- Folder2
- ParentFolder.zip
Note: the process runs on a Unix server.
We have been calling Java code to zip individual files, like this:
&buffer = CreateJavaArray("byte[]", 18024);
&zipStream = CreateJavaObject("java.util.zip.ZipOutputStream", CreateJavaObject("java.io.FileOutputStream", &outDir | &outZip));
For &i = 1 To &inFiles.Len
   &zipStream.putNextEntry(CreateJavaObject("java.util.zip.ZipEntry", &inFiles[&i]));
   &inStream = CreateJavaObject("java.io.FileInputStream", &outDir | &inFiles[&i]);
   &len = &inStream.read(&buffer);
   While &len > 0
      &zipStream.write(&buffer, 0, &len);
      &len = &inStream.read(&buffer);
   End-While;
   &zipStream.closeEntry();
   &inStream.close();
End-For;
&zipStream.close();
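That snippet only handles a flat list of files, so Folder1 and Folder2 never make it into the archive. A rough Java sketch of the recursive walk it would need is below (the parent-folder path is a placeholder, and the 18024-byte buffer just mirrors the snippet above); it builds entry names like Folder1/file1.txt and could be compiled into a small helper class or translated back into the same CreateJavaObject calls:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipFolders {

    // Recursively add every file under dir, naming entries relative to base
    // so the archive contains "Folder1/file1.txt", "Folder2/file2.txt", ...
    static void addDir(File base, File dir, ZipOutputStream zip, byte[] buffer) throws IOException {
        for (File file : dir.listFiles()) {
            if (file.isDirectory()) {
                addDir(base, file, zip, buffer);
            } else if (!file.getName().endsWith(".zip")) {  // skip the archive being written
                String entryName = base.toURI().relativize(file.toURI()).getPath();
                zip.putNextEntry(new ZipEntry(entryName));
                try (FileInputStream in = new FileInputStream(file)) {
                    int len;
                    while ((len = in.read(buffer)) > 0) {
                        zip.write(buffer, 0, len);
                    }
                }
                zip.closeEntry();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        File parent = new File("/path/to/ParentFolder");  // placeholder path
        byte[] buffer = new byte[18024];
        try (ZipOutputStream zip = new ZipOutputStream(
                new FileOutputStream(new File(parent, "ParentFolder.zip")))) {
            addDir(parent, parent, zip, buffer);
        }
    }
}
```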
I am looking for a way to upload files within a nested folder structure using R. I tried the functions below from the aws.s3 package, which upload either a single file or a folder:
library("aws.s3")
put_object("pathoftheobject", object = "filename", bucket = "bucketname")
put_folder("foldername", bucket = "bucketname")
Folder structure and files:
ParentFolder
SubFolder1
File1
File2
SubFolder2
File3
File4
............
SubFoldern
Filen
Any guidance here will be really useful.
I'm not an R developer, but in C#, instead of creating folders under buckets, I use file names containing / so that S3 recognizes them as sub-folders.
My file names when uploading:
ParentFolder/SubFolder1/File1
ParentFolder/SubFolder1/File2
ParentFolder/SubFolder2/File3
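I am not sure of the exact R equivalent, but here is a sketch of the same idea in Java with the AWS SDK (v1), purely as an illustration (the bucket name and local path are placeholders): walk the local tree and use the relative path, slashes included, as the object key, and S3 will show the sub-folders.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class UploadTree {
    public static void main(String[] args) throws Exception {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();  // credentials from the default chain
        Path root = Paths.get("ParentFolder");                // placeholder local folder

        try (Stream<Path> files = Files.walk(root)) {
            files.filter(Files::isRegularFile).forEach(file -> {
                // Key like "ParentFolder/SubFolder1/File1"; the slashes create the "folders"
                String key = root.getFileName() + "/" + root.relativize(file).toString().replace('\\', '/');
                s3.putObject("bucketname", key, file.toFile());
            });
        }
    }
}
```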
The aws.s3::s3sync function does what you need:
library(aws.s3)

Sys.setenv("AWS_ACCESS_KEY_ID" = access_key_id,
           "AWS_SECRET_ACCESS_KEY" = secret_access_key,
           "AWS_DEFAULT_REGION" = "eu-central-1",
           "AWS_SESSION_TOKEN" = session_token)

s3sync(files = dir(paste0(getwd(), "/Folder1Name/", "Folder2Name"), recursive = TRUE),
       bucket = "BucketName", direction = "upload", verbose = TRUE)
I have the following directory on my laptop:
/tmp/
myapp/
assets/
config.yml
models/
troll.ply
tree.ply
textures/
troll-skin.png
tree-skin.png
I would like to zip /tmp/myapp/assets (and all its recursive contents) up into a ZIP named assets.zip, such that, when I unzip it (via unzip assets.zip), it preserves the directory structure under the assets folder. Hence, when unzipped, it would show config.yml in the "root" of the ZIP, and 2 directories inside the ZIP (models and textures). The rest of the files would be inside these respective subdirectories, etc.
When I run this code:
File sourceDir = new File("/tmp/myapp/assets");
ZipOutputStream zip = new ZipOutputStream(new FileOutputStream("/Users/myuser/archives/assets.zip"));
File[] contents = sourceDir.listFiles();
for (File file : contents) {
    zip.putNextEntry(new ZipEntry(file.getName()));
    Files.copy(file.toPath(), zip);
    zip.closeEntry();
}
zip.close();
The code correctly creates a ZIP at /Users/myuser/archives/assets.zip.
However, when I unzip it (unzip /Users/myuser/archives/assets.zip) and then run ls -al /Users/myuser/archives, my output is:
-rw-r--r-- 1 myuser 1754083733 492 Dec 30 14:14 assets.zip
-rw-r--r-- 1 myuser 1754083733 10 Dec 30 14:14 config.yml
-rw-r--r-- 1 myuser 1754083733 7 Dec 30 14:14 models
-rw-r--r-- 1 myuser 1754083733 9 Dec 30 14:14 textures
So both models and textures are being treated like files (not as directories). Furthermore, when I take a peek at the contents of the "models" file, it appears that the contents of troll.ply and tree.ply have been concatenated inside it, and ditto for the "textures" file with the two PNGs.
How can I tweak this so that directory structure (no matter how deep/nested) is always preserved in the resultant ZIP?
You can use a recursive method to preserve the sub-directory structure:
private static void addDir(File sourceDir, ZipOutputStream zip) throws IOException {
    File[] contents = sourceDir.listFiles();
    for (File file : contents) {
        if (file.isDirectory()) {
            addDir(file, zip);
        } else {
            // Entry name relative to /tmp/myapp/, e.g. "assets/models/troll.ply"
            String entryName = file.getAbsolutePath().replace("/tmp/myapp/", "");
            System.out.println("file name " + entryName);
            zip.putNextEntry(new ZipEntry(entryName));
            Files.copy(file.toPath(), zip);
            zip.closeEntry();
        }
    }
}
And you call it from the main method as below:
public static void main(String[] args) throws IOException {
    File sourceDir = new File("/tmp/myapp/assets");
    ZipOutputStream zip = new ZipOutputStream(new FileOutputStream("/Users/myuser/archives/assets.zip"));
    addDir(sourceDir, zip);
    zip.close();
}
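If you would rather not hard-code the "/tmp/myapp/" prefix, here is an alternative sketch (my own variation, not part of the original answer) that walks the tree with java.nio and names each entry relative to the assets folder itself, so config.yml lands at the root of the ZIP and models/ and textures/ become sub-directories:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipAssets {
    public static void main(String[] args) throws IOException {
        Path sourceDir = Paths.get("/tmp/myapp/assets");
        try (ZipOutputStream zip = new ZipOutputStream(
                     new FileOutputStream("/Users/myuser/archives/assets.zip"));
             Stream<Path> walk = Files.walk(sourceDir)) {
            walk.filter(Files::isRegularFile).forEach(file -> {
                try {
                    // Entry name relative to assets/, e.g. "models/troll.ply"
                    String entryName = sourceDir.relativize(file).toString().replace('\\', '/');
                    zip.putNextEntry(new ZipEntry(entryName));
                    Files.copy(file, zip);
                    zip.closeEntry();
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
        }
    }
}
```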
Zipping through java.util.zip seems to work differently on different operating systems. It worked fine for me on Windows 7, but on Linux (RHEL 6) the files came before the folders, which caused tests to fail.
One way to solve it is to sort the files and folders, folders first and then files. So the...
File[] contents = sourceDir.listFiles();
...File array should be sorted by path. Create a List<File> from the files and sort it:
Collections.sort(newFiles, (a, b) ->
b.getAbsolutePath().compareTo(a.getAbsolutePath())
);
Note: I created an InputFile object to store the absolute path of each file.
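The snippet above is only a fragment (newFiles and the InputFile wrapper are not shown). A self-contained sketch of the same idea, assuming plain java.io.File objects, that puts directories ahead of files and orders each group by path:

```java
import java.io.File;
import java.util.Arrays;
import java.util.Comparator;

public class SortedListing {
    // Directories first, then regular files; each group ordered by absolute path
    // so the traversal order is the same on Windows and Linux.
    static File[] sortedContents(File sourceDir) {
        File[] contents = sourceDir.listFiles();
        Arrays.sort(contents, Comparator
                .comparing((File f) -> f.isFile())       // false (directory) sorts before true (file)
                .thenComparing(File::getAbsolutePath));
        return contents;
    }
}
```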
Recently I changed from CMake to Premake (v5.0.0-alpha8) and I'm not quite sure how to achieve the following in Premake.
I want to include some dependencies so in CMake I can do something like this:
target_link_libraries(${PROJECT_NAME}
${YALLA_ABS_PLATFORM}
${YALLA_LIBRARY})
The above will add the paths of these libraries (dir) to "Additional Include Directories" in the compiler and it will also add an entry (lib) to "Additional Dependencies" in the linker so I don't need to do anything special beyond calling target_link_libraries.
So I expected that when I'm doing something like this in Premake:
links {
YALLA_LIBRARY
}
I'd get the same result but I don't.
I also tried to use libdirs, but it doesn't really work: I can't see the library directory and its subdirectories passed to the compiler as "Additional Include Directories" (/I), or Yalla.Library.lib passed to the linker as "Additional Dependencies".
Here is the directory structure I use:
.
|-- src
| |-- launcher
| |-- library
| | `-- utils
| `-- platform
| |-- abstract
| `-- win32
`-- tests
`-- platform
`-- win32
The library dir is defined in Premake as follows:
project(YALLA_LIBRARY)
kind "SharedLib"
files {
"utils/string-converter.hpp",
"utils/string-converter.cpp",
"defines.hpp"
}
The platform dir is defined in Premake as follows:
project(YALLA_PLATFORM)
kind "SharedLib"
includedirs "abstract"
links {
YALLA_LIBRARY
}
if os.get() == "windows" then
include "win32"
else
return -- OS NOT SUPPORTED
end
The win32 dir is defined in Premake as follows:
files {
"event-loop.cpp",
"win32-exception.cpp",
"win32-exception.hpp",
"win32-window.cpp",
"win32-window.hpp",
"window.cpp"
}
And finally at the root dir I have the following Premake file:
PROJECT_NAME = "Yalla"
-- Sets global constants that represents the projects' names
YALLA_LAUNCHER = PROJECT_NAME .. ".Launcher"
YALLA_LIBRARY = PROJECT_NAME .. ".Library"
YALLA_ABS_PLATFORM = PROJECT_NAME .. ".AbstractPlatform"
YALLA_PLATFORM = PROJECT_NAME .. ".Platform"
workspace(PROJECT_NAME)
configurations { "Release", "Debug" }
flags { "Unicode" }
startproject ( YALLA_LAUNCHER )
location ( "../lua_build" )
include "src/launcher"
include "src/library"
include "src/platform"
I'm probably misunderstanding how Premake works due to lack of experience with it.
I solved it by creating a new global function, which I named includedeps.
function includedeps(workspace, ...)
    local workspace = premake.global.getWorkspace(workspace)
    local args = { ... }
    local args_count = select("#", ...)
    local func = select(args_count, ...)
    if type(func) == "function" then
        args_count = args_count - 1
        args = table.remove(args, args_count)
    else
        func = nil
    end
    for i = 1, args_count do
        local projectName = select(i, ...)
        local project = premake.workspace.findproject(workspace, projectName)
        if project then
            local topIncludeDir, dirs = path.getdirectory(project.script)
            if func then
                dirs = func(topIncludeDir)
            else
                dirs = os.matchdirs(topIncludeDir .. "/**")
                table.insert(dirs, topIncludeDir)
            end
            includedirs(dirs)
            if premake.project.iscpp(project) then
                libdirs(dirs)
            end
            links(args)
        else
            error(string.format("project '%s' does not exist.", projectName), 3)
        end
    end
end
Usage:
includedeps(PROJECT_NAME, YALLA_LIBRARY)
or
includedeps(PROJECT_NAME, YALLA_PLATFORM, function(topIncludeDir)
    return { path.join(topIncludeDir, "win32") }
end)
Update:
For this to work properly, you need to make sure that the dependencies are included in dependency order, not in the order of the directory structure.
So, for example, if I have the dependency graph launcher --> platform --> library, then I have to include them in the following order:
include "src/library"
include "src/platform"
include "src/launcher"
As opposed to the order of the directory structure, which in my case is as follows:
src/launcher
src/library
src/platform
If you include them in directory-structure order instead, it will fail and tell you that "The project 'Yalla.Platform' does not exist."
Here's what I am trying to do.
I have a few hundred users' My Documents folders, most (but not all) of which contain a file (key.shk, used by the shortkeys program).
I need to upgrade the software, but doing so changes the original file.
I would like to run a batch file on the server to find the file in each My Documents folder and make a copy of it there called backup.shk.
I can then use this for rollback.
The folder structure looks like this
userA\mydocs
userB\mydocs
userC\mydocs
My tools are xcopy, robocopy, or PowerShell.
Thanks in advance
This PowerShell script works; save it as a .ps1 file:
Function GET-SPLITFILENAME ($FullPathName) {
    $PIECES = $FullPathName.split("\")
    $NUMBEROFPIECES = $PIECES.Count
    $FILENAME = $PIECES[$NumberOfPieces - 1]
    $DIRECTORYPATH = $FullPathName.Trim($FILENAME)
    $FILENAME = [System.IO.Path]::GetFileNameWithoutExtension($FullPathName)
    return $FILENAME, $DIRECTORYPATH
}

$Directory = "\\PSFS03\MyDocs$\Abbojo\Insight Software"

Get-ChildItem $Directory -Recurse | where { $_.extension -eq ".txt" } | % {
    $details = GET-SPLITFILENAME($_.fullname)
    $name = $details[0]
    $path = $details[1]
    copy $_.fullname $path$name"_backup".txt
}
I want to develop a script that copies, verifies, and then deletes files over x days old, moving them from one network location to another.
Here is my algorithm:
Recursively traverse a network location ($movePath)
for all files where $_.LastWriteTime >= x days | ForEach {
    xcopy or robocopy $FileName = $_.FullName.Replace($movePath, $newPath)
    if (the files were written correctly) {
        (delete) Remove-Item $FileName from $movePath
    }
}
Can I combine the xcopy /v (verify) with robocopy?
Do you want to maintain the subfolder structure (i.e. files from a subfolder in the source go into the same subfolder in the destination)? If so, this should suffice:
$src = 'D:\source\folder'
$dst = '\\server\share'
$age = 10 # days
robocopy $src $dst /e /move /minage:$age
robocopy can handle verification (done automatically) and deletion by itself.