Rename an artifact during its uploading by Jenkins - artifactory

I'm using Jenkins to upload RPM artifacts to a generic repository.
I would like to rename the rpm file from my_product_1.1.0.0.rpm to my_product.rpm.
I tried to add a
curl -umyUser:myP455w0rd! -T "http://artifactory:8081/../old name" "http://artifactory:8081/../new name"
command for uploading, where the source is the Artifactory repo and the destination is the same repo but with a different file name. It fails with "cannot find the source file".
Later, I tried to do it using the "Publish Artifacts" field in Jenkins:
/drop_folder/ => repo/my_product.rpm
but in this case Artifactory created a folder "my_product.rpm" and uploaded my_product_1.1.0.0.rpm inside it.
Can it be done in a different way?

Using the JFrog CLI for Artifactory from a Jenkins pipeline you have two options:
Copy the file to the new name, within the same repo or to another repo (see the sketch after this list):
jfrog rt cp "your-artifactory-repo/oldname.extension" "your-artifactory-repo/newName.extension"
Download the artifact and re-upload it with the new name (not recommended).
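Applied to the question's file names, the first option might look like this sketch (rpm-local and drop_folder are assumed names; adjust them to your actual repo and path):
# copy the artifact to the new name inside the same repo; remove the old one afterwards with "jfrog rt del" if you want a pure rename
jfrog rt cp "rpm-local/drop_folder/my_product_1.1.0.0.rpm" "rpm-local/drop_folder/my_product.rpm"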

Related

How to Upload Multiple Artifacts to artifactory along with their folder structure using Jfrog CLI

I need to upload all the artifacts generated as part of the build to Artifactory based on their folder structure.
Here the folder structure is nothing but the group id, artifact id and version mentioned at the POM level for each of the dependencies.
So the expectation is: how do I specify that folder structure in the "jfrog rt upload" CLI command?
Because the folder structure will change for every artifact.
jfrog rt upload --flat=false "${dynamic folder structure}/*" p2-release-local/
As per our expectation, artifacts should go like this:
http://<artifactorylink>/<group id>/<artifact id>/<version>/<.jar>
http://<artifactorylink>/<group id>/<artifact id>/<version>/<.pom>
NOTE: I am using a freestyle job in Jenkins, so I cannot use the JFrog plugin to do this for me. And the plugins which are available for freestyle jobs will upload some artifacts, but we need to provide the group and artifact id for them, so they don't seem to be helping.
Please let me know how I could accomplish this, or if there is any other way I could upload these artifacts to JFrog along with their folder structure.
If you want to upload a folder and the files in it using JFrog CLI, you may visit the JFrog CLI documentation for the complete details.
In short, you may use the command below.
jfrog rt u "root/test/(*)" p2-release-local/test/{1}
To apply the source path pattern for directories as well as files, add the include-dirs flag.
jfrog rt u --include-dirs=true "root/test/(*)" p2-release-local/folder/{1}
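For the Maven-style layout in the question, a sketch using --flat=false so the local group/artifact/version path is preserved under the target repository (the com/example path is made up for illustration):
# local layout: com/example/myapp/1.0.0/myapp-1.0.0.jar and myapp-1.0.0.pom
jfrog rt u --flat=false "com/example/myapp/1.0.0/*" p2-release-local/
# ends up as p2-release-local/com/example/myapp/1.0.0/myapp-1.0.0.jar (and .pom)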

Unable to push Repo from "react-native-firebase-starter" template

I cloned this repo to start my own project
https://github.com/invertase/react-native-firebase-starter
I have made some modifications and got it set up for Firebase; however, I cannot push or rename the repository.
I ran npm run rename and renamed the directory. GitHub still seems to think I am trying to push the original repository as my own.
When I try to push I get:
Authentication failed. You may not have permission to access the repository or the repository may have been archived...
How can I keep this template/starter and push a copy of it as my own repository?
I have tried removing all of the inessential files from the Repo and pushing that way. I get the following error:
I expected to be able to use the starter as a starter to get a project up and running... Maybe I am missing something super obvious.
I don't see a .git folder within the root of the react-native-firebase-starter template; perhaps this is causing issues with pushing this template, since git needs to know where to point upstream.
Maybe you could try initializing the template as your personal git repository and seeing if this resolves your authentication issue:
Create a new repository on GitHub. To avoid errors, do not initialize the new repository with README, license, or gitignore files.
Initialize the local directory containing the template as a Git repository:
git init
Add the files in your new local repository. This stages them for the first commit:
git add .
Commit the files that you've staged in your local repository:
git commit -m "Initial commit"
At the top of your GitHub repository, created in step 1, copy the remote repository URL.
Add the URL for the remote repository where your local repository will be pushed:
git remote add origin <remote_repository_url>
Push the changes in your local repository to your upstream repository contained in GitHub:
git push -u origin master
You should now be able to push this starter template into your own GitHub repository and use it as your own project.
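If the clone still has the original invertase remote configured, an alternative sketch is to repoint origin instead of re-initializing (the URL is a placeholder for your own repository):
git remote -v                    # check where origin currently points
git remote set-url origin https://github.com/<your-username>/<your-repo>.git
git push -u origin master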
As for the npm run rename command: this is a custom npm run script created by the author of this starter template and it simply runs the rename.js file contained within the .bin directory of the template's root directory. All this command does is recursively rename the files contained within this template project to the new name specified by your input, so I don't think this is causing the issue. I suspect once your project has been initialized properly with git the authentication issue will disappear as it will now point upstream to your personal repository.
Hopefully that helps!

jfrog copy directory to another directory

With the jfrog-cli I want to copy "tags/$version" to "released/$version"
$version has subdirectories that I need to keep in the respective format.
I tried :
jfrog rt cp --flat=false repo/tags/$version/* repo/released/
What I get is:
repo/released/tags/$version
Can anyone help?
Artifactory cp/mv is BIZARRE... it does not work like a regular file system move or copy command.
The following moves (within the same repository) everything, including subfolders, from test/example/libraries/1.0.0 to test/example/libraries/2.0.0.
jfrog rt mv release-local/test/example/libraries/1.0.0 release-local/2.0.0 --user ${userid} --password ${userpwd} --url https://artifactory
Before
/test
  /example
    /libraries
      /1.0.0
        file1.dll
        /folder1
          file2.dll
After
/test
  /example
    /libraries
      /2.0.0
        file1.dll
        /folder1
          file2.dll
The command appears to take the source path, drop its last element, and append the last element of the target path; basically it is like a file rename where the path comes along as part of the rename. Note that you do not have to use the recursive flag or the flat flag for it.
In the latest release of JFrog CLI (version 1.23.1) the command to copy files from one repository to another should be similar to what you have.
For example, if I want to copy all versions from the repository libs-snapshot-local/, path org/jfrog/test/, to a repository called generic-local, I can execute the command:
jfrog rt cp libs-snapshot-local/org/jfrog/test/ generic-local/
By default, it will recursively copy all items of libs-snapshot-local/org/jfrog/test/ to their new location in generic-local
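For the original question (copying tags/$version to released/$version while keeping the subdirectories), a sketch using the CLI's placeholder syntax; repo and $version are the question's own names:
jfrog rt cp "repo/tags/$version/(*)" "repo/released/$version/{1}"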

Artifactory cli - download existing files

I'm using the JFrog CLI to download content from Artifactory. It seems that even though the destination contains the same files, the CLI still tries to download them. If I re-run the command without cleaning the destination folder, it takes the same time.
Is there any option to speed up the process? If the destination folder already has a file with the same SHA1, skip it?
Our command (download all folders a* in the repo):
jfrog rt dl --threads=`nproc` repo_name/a*/ $TMP_FOLDER/
JFrog CLI already skips the download when the file exists locally; this is validated using a checksum.
You can see this by setting the environment variable "JFROG_CLI_LOG_LEVEL=DEBUG" and then running the same download command again. In the debug log you will see, for some files, the line "File already exists locally" - this means the download was skipped because the file already exists.
The relevant code can be found in GitHub - see the method "downloadFileIfNeeded".
Keep in mind that the CLI still has to get the file info from Artifactory and calculate the local file checksum, so with a lot of small files this won't have as strong an effect as it does on large file downloads.
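To see the skipping in action with the command from the question, a sketch (same repo_name and $TMP_FOLDER placeholders as above):
export JFROG_CLI_LOG_LEVEL=DEBUG                       # enable debug logging
jfrog rt dl --threads=$(nproc) "repo_name/a*/" "$TMP_FOLDER/"
# files already present locally are logged as "File already exists locally" and skipped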

How to share/transfer an Atom installation (packages and settings) from one Mac to another?

Is it possible to copy Atom from one Mac to another, including all installed packages, settings etc?
There are several ways to synchronize your settings and packages between Atom installations:
Git: Create a public or private Git repo and store the contents of your local ~/.atom folder in there (a setup sketch follows at the end of this answer). Ignore the following files/directories in a .gitignore file:
storage
compile-cache
dev
.npm
.node-gyp
Use a package like sync-settings. This will store your configuration in a GitHub Gist.
Dropbox (or similar): Move your ~/.atom folder to your Dropbox folder and then symlink it from there to its original location. This has the downside of syncing everything in ~/.atom, even the things you could ignore.
Use stars to select your favorite packages. On the Atom web site, create an account and mark your favorite packages with stars. Then use apm stars --install to install all starred packages on any machine. Downside: This only works for packages, not for settings.
More details:
https://discuss.atom.io/t/syncing-settings-packages-between-machines/1385
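For the Git option above, a minimal sketch of the initial backup (the remote URL is a placeholder for your own dotfiles repository):
cd ~/.atom
printf '%s\n' storage compile-cache dev .npm .node-gyp > .gitignore   # ignore generated content
git init
git add .
git commit -m "Back up Atom settings"
git remote add origin <your_dotfiles_repo_url>
git push -u origin master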
As a user who uses a dotfile management system such as RCM, I prefer independent config files.
For now, Atom doesn't officially provide a packages.cson file to manage plugins, but as the post Syncing settings & packages between machines mentioned, there is a plugin called package-sync that will generate a packages.cson file for us.
So with the help of package-sync, now I can just sync those minimal config files to have my Atom settings and packages consistent across multiple machines.
This is how to do it (using Ubuntu as an example):
Install Atom, and install package-sync through Edit --> Preferences --> Install.
Open your command palette and type: Create Package List. A packages.cson file will be created under your ~/.atom folder.
Edit the gitignore file:
$ gedit ~/.atom/.gitignore
Make sure the content is:
blob-store
compile-cache
dev
storage
.node-gyp
.npm
.apm
packages/
atom-shell/
This makes sure the content downloaded by Atom from the Internet will not get synced to your dotfiles repo.
Move the .atom folder to the dotfile repo:
$ mv ~/.atom ~/dotfiles/tag-atom/atom
Relink the folder:
$ ln -s ~/dotfiles/tag-atom/atom ~/.atom
Or if you have rcm installed:
$ rcup
Now go to another machine, install Atom and package-sync, update your dotfiles repo, and then open your Atom command palette and type: sync
Now your Atom settings will get synced and integrated with the RCM dotfile management system.
I recently built a package that syncs automatically your Atom settings and packages across multiple computers. A little bit like the bookmark synchronization mechanism in Google Chrome. It's called atom-package-sync. Maybe it could fit your needs.
You can sync your packages via package-list.txt file and a simple shell script.
Create the package-list.txt file
apm list --installed --bare > package-list.txt
Install missing packages on another host
# Install every package listed in package-list.txt that is not already installed.
BASEDIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
INSTALLED_PKGS=$(apm list --installed --bare)
for PKG in $(cut -f1 -d'#' "$BASEDIR/package-list.txt"); do
  grep -q "$PKG" <<< "$INSTALLED_PKGS" || apm install "$PKG"
done
The .atom folder contains the packages folder, which can be rather huge. Unfortunately OneDrive doesn't allow you to exclude folders, so I went with a git option.
I excluded the packages from git and instead I committed a text file containing my packages (my-packages.txt).
To re-install packages I need to run: apm install --packages-file my-packages.txt.
To generate the my-packages.txt, I need something like this on a Bash shell: ls packages | xargs -n 1 echo | cut -d/ -f1 > my-packages.txt
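An alternative sketch derives the same list from apm itself (its --bare output is name@version, so the version suffix is stripped):
apm list --installed --bare | cut -d@ -f1 > my-packages.txt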
I sync my Atom settings between Windows, macOS, and Linux machines using Resilio Sync Home. It is free and the files are not saved in the "cloud" (like Dropbox or Gists), but it requires that at least two machines are online in order to sync the current settings.
Since I do not want to sync caches, installation-specific settings, and so on, I update the .sync/IgnoreList file that is created in the synced directory (i.e., the ~/.atom directory). Unfortunately, you will have to update this on each machine that you sync (ironically, the IgnoreList file itself is not synced). By default, the file specifies various temporary files to be omitted from syncing, so you'll need to add the following:
## Atom-specific
/packages/node-debugger/debugger.log
\packages\node-debugger\debugger.log
/.apm
\.apm
/.node-gyp
\.node-gyp
/.npm
\.npm
/blob-store
\blob-store
/compile-cache
\compile-cache
/dev
\dev
/recovery
\recovery
/split-diff
\split-diff
/storage
\storage
Some of the omitted directories are package-specific (e.g., split-diff). Because Windows has different path delimiters than other platforms, I need to specify each path with both delimiters (!!).
Install Resilio Sync Home on your first machine
Add the .atom directory to Resilio to be synced.
Update its IgnoreList file, as shown above. Save this file for the other machines you want to sync with.
Send a Resilio "Read & Write" link of that folder to the other machines you want to sync with or copy the "Read & Write" key to be used on the other machines. To do this, in Resilio's folder view, click on the .atom folder's menu (vertical dots on the right edge) and select "Copy Read & Write key". Save it for later.
Then on your other machines,
Install Resilio Sync Home
Create .atom/.sync
Copy the IgnoreList from your first machine to that directory
Add the .atom directory to be synced with the other machine. You should add the folder using "Enter key or link," then enter the key you copied above.
Wait until syncing is done before opening Atom. The first time may take a few minutes.
Now I don't need to go around installing/removing packages on every machine, separately!
FYI: Changes to files and directories are saved in .sync/Archive, for some period of time, if you should need to recover them.
