I own Object Storage containers at OVH. I found out how to copy a file from one container to another:
swift copy -d /"Destination container" "Initial container" "File"
It works, but I want to copy hundreds of files. Do you know how to copy many files that are all in one folder?
Thank you
The OpenStack Swift API doesn't have methods to copy multiple objects. The PUT, GET and COPY methods all operate on single objects.
So you have to loop over a listing of the source container, and copy each file one at a time. You can do this using a shell script and the swift command, or using one of the Swift API libraries and some custom code (e.g. in Python).
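For example, a minimal shell-script sketch of that loop (the container names and folder prefix below are placeholders):

# List everything under a pseudo-folder in the source container,
# then copy it object by object using the server-side COPY.
swift list --prefix "my-folder/" "Initial container" | while IFS= read -r obj; do
    swift copy -d "/Destination container" "Initial container" "$obj"
done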
Alternatively you could use rclone (link) or potentially Cyberduck (link). These tools apparently don't use the COPY method, so the copy will entail extra transfer steps and may take longer overall than if you wrote a script to do the job.
Are there more examples of custom build JSON payloads beyond what is available at https://www.jfrog.com/confluence/display/RTF/Artifactory+REST+API? Or perhaps more in-depth documentation on the “application/vnd.org.jfrog.build.BuildsByName+json” payload?
We have a build that produces both JAR/IVY and RPM files (and some other file types that Artifactory doesn’t really know the content of). Today, we publish those into a generic repository to keep everything together.
What would be ideal is to be able to create my own custom build using the REST API, composed of the JAR files + RPM files, so I can do licensing searches across them.
In the example given, the artifacts that make up the build are referenced by ID/name/hash.
The problem with the current Jenkins/Artifactory/Gradle plugin that we use is that our build is split across many smaller builds, but is ultimately released as one. This makes producing a full report somewhat difficult, and gives us no easy way to do license checks that include the RPM files. We want to be able to publish one build, which contains everything we know about the build.
The current setup also has us uploading our JARs into a Maven repository, which adds time to the builds, given that we are also publishing the same content into the generic repository alongside the RPMs and other content.
Thanks!
The build info JSON is fully documented in the README of this repository: https://github.com/JFrogDev/build-info. That is also the repository that holds the code of the build info engine used by the various JFrog CI/build plugins.
You can definitely create your own build info JSON, and if you're going to use Java to do that, you should check out this project, which demonstrates the usage of the various build info Java APIs: https://github.com/JFrogDev/project-examples/tree/master/build-info-java-example
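Once you have assembled your own build info JSON, you can publish it through Artifactory's build upload REST endpoint; a rough curl sketch (the host, credentials and the build-info.json file name are placeholders):

# PUT a hand-rolled build info document to Artifactory's build upload endpoint
curl -u admin:password -X PUT \
     -H "Content-Type: application/json" \
     --data-binary @build-info.json \
     "https://artifactory.example.com/artifactory/api/build"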
Another option you may want to look into is the JFrog CLI, which recently brought support for associating artifact deployment/resolution with a build object and deploying it to Artifactory. This method is completely agnostic to the file types your build produces or the build tool you are using. Have a look at the official documentation here: https://www.jfrog.com/confluence/display/CLI/CLI+for+JFrog+Artifactory#CLIforJFrogArtifactory-BuildIntegration
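As a rough illustration of that flow (the repository, build name/number and file patterns below are made up):

# Upload the artifacts and associate them with a build name/number
jfrog rt upload "build/libs/*.jar" generic-local/myapp/ --build-name=myapp --build-number=42
jfrog rt upload "build/rpm/*.rpm" generic-local/myapp/ --build-name=myapp --build-number=42
# Publish the accumulated build info to Artifactory as a single build
jfrog rt build-publish myapp 42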
Lastly, if you are using Jenkins, the Jenkins Artifactory Plugin now has Pipeline APIs that will allow you to collect artifacts and build information programmatically, and even concatenate multiple build info objects to deploy them as a single build entity to Artifactory, which is pretty wicked. Have a read about this here: https://wiki.jenkins-ci.org/display/JENKINS/Artifactory+-+Working+With+the+Pipeline+Jenkins+Plugin
I would like to issue one command to build both a grails and a flex project (the Flex project can be built with Ant). I have a file, WEB-INF/flex/services-config.xml which needs to be different for the PROD war build and the DEV environment.
I'm thinking of having two files: services-config-PROD.xml and services-config-DEV.xml and then copying the relevant one to services-config.xml whenever a build happens.
So in dev I run 'grails run-app' and it copies the file and runs the app; and for prod I run 'grails war' (or some other command) and it copies the file, creates a war, and also calls the Flex project to build via its Ant build file.
What would be the best way to achieve this, or at least any part of what I'm asking?
Seems overly complicated. Why do you need the services-config at all? Personally, I never use it; I use code to create my services, which can be done dynamically if needed. If you want one setup for prod and another for dev, you can do it all within code.
I would imagine that prod and dev are just two different server URLs? I normally have the HTML wrapper pass those URLs in using FlashVars. That way, both servers can point to exactly the same SWF (or different SWF versions), and you just change that one FlashVar to make the application point at a different location.
I need to know what folder names are commonly used by Version Control systems. Many VCSs will create a hidden folder, usually in the top level of the source tree, to store the information that they use.
So far, I only know that Git uses .git/ and SVN uses .svn/.
What are the folder names that other popular VCSs use?
We could probably divide the VCSs into three groups:
Special Subdirectory in each directory
CVS (CVS)
Subversion (.svn)
The advantage of this is that each directory in the working copy is a self-contained working copy: you can copy it out somewhere else and it will still work. The obvious disadvantage is the clutter. Automatic tools that scan over one of these working copies need special filtering, or they will return spurious results (see the find example after this list).
Single special directory for each working copy
Mercurial (.hg)
SVK
(Maybe Git, I'm not sure?)
Special file system support
ClearCase (dynamic view is a mounted FS; snapshot view is more similar to the single directory case)
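As a rough example of that filtering with find (covering the admin directory names mentioned above):

# Prune the VCS admin folders while listing regular files
find . \( -name .svn -o -name CVS -o -name .git -o -name .hg \) -prune -o -type f -print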
Mercurial = a single .hg directory
Our organization's custom build tools write out a lot of intermediate data, and I'd like it if Hudson could detect which files were created as part of a build and archive those. I'm not sure if it already does so, but if it does there's no user-visible explanation of it, and certainly deleting a build does not delete its output.
In detail here's what I want. Suppose I start with a bare workspace. After build 1, I have this:
ws/
    src/...
    obj/
        1/...
    log/
        1/...
    pkg/
        pkg-1.tgz
Now, I run build 2:
ws/
    src/...
    obj/
        1/...
        2/...
    log/
        1/...
        2/...
    pkg/
        pkg-1.tgz
        pkg-2.tgz
The source code is checked out into ws/src each build; there's a custom checkout process, so I can't use the svn RCS method :/.
When I delete a build, I'd like to delete everything that came from that build. Can I do this?
The Hudson way would be to clean up all temporary files at the beginning of the build, and then use Hudson's artifact archiving facility to save the output from each build: specify pkg/**/*.tgz for the "Archive the artifacts" post-build step, and the tgz files will all be copied into the job-specific storage area.
The workspace is just a workspace - it's not intended for long term storage.
What I would do, if I were in your shoes, would be to solve the problem a different way. I would write a task or script that specifically deletes everything you don't want to keep, and run that task or script at the end of each job.
Hudson assumes you're cleaning up temp files on your own. If that's not happening, I don't believe Hudson has any facility to help you.
It's difficult to answer without understanding your "custom checkout process", but typically all of the project's build artifacts are created beneath a single directory apart from the src, e.g. "build" or "target".
That way, the project's "clean" target can simply remove that directory, and you can instruct Hudson to include this target for each build.
It appears your build artifacts are placed in several directories alongside 'src': 'obj', 'log', and 'pkg'. Can you introduce a "clean" target that explicitly deletes each of those directories?
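For example, a minimal clean step run as a shell build step in the Hudson job (this assumes obj, log and pkg are the only generated directories, and that Hudson has already archived the pkg/*.tgz artifacts):

# Remove everything the previous build generated; src is checked out fresh anyway
cd "$WORKSPACE"
rm -rf obj log pkg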
How would I go about having a CMake build system, which currently scans for source files using AUX_SOURCE_DIRECTORY, also scan for header files in the same directory, preferably using a similar command?
I didn't find an easy way to do this in the documentation yet, so I now have a crappy bash script to post-process my (CodeBlocks) project file...
You can use the file(GLOB ... ) command. For example:
set(dir my_search_dir)
file (GLOB headers "${dir}/*.h")
message("My headers: " ${headers})
This command can also recurse, and list your files relative to a given path. See the "file" command entry in the cmake doc.
The documentation for AUX_SOURCE_DIRECTORY suggests that it was not intended to be used that way, so I rather doubt that what you're asking is possible. If you want an authoritative answer, you can reach the CMake developers at cmake@cmake.org (they're actually very nice to deal with).
I'd recommend strongly against using wildcards to specify what goes into the build. The build files should specify the exact contents of the libraries, and not depend on what happens to exist in the directory. It may be cumbersome at first (if you're used to wildcards, or to IDEs that work the same way), but once you get used to it, you won't want it any other way.