Incorrect directory structure created using JFrog CLI - Artifactory

I am trying to download the entire folder (as it is with all files and subfolders) from an Artifactory repo to my local folder.
Note - I am using Artifactory Pro cloud version
This is how my Artifactory local repo (generic) looks -
I run the following command in JFrog CLI (used this article as reference) -
jfrog rt download --include-dirs=true --flat=true --user=XXX --password=XXX --url=https://XXXX.jfrog.io/XXXX --recursive '/support-pack/aem-dispatcher/files/(*)' '{1}'
The files get downloaded; however, it results in a weird folder structure -
Below is a screenshot of the logs.
Notice the additional resume folder under the resume folder. Why is this happening?
I want the exact structure under the files folder in Artifactory to be replicated in my local folder.
Please help!

The answer turned out to be much simpler than expected. All I had to do was get rid of --include-dirs=true in my command.
Read this article for more info.
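For reference, the working command should then look like this (same placeholders as in the question):
jfrog rt download --flat=true --user=XXX --password=XXX --url=https://XXXX.jfrog.io/XXXX --recursive '/support-pack/aem-dispatcher/files/(*)' '{1}'
With --include-dirs dropped, the (*) capture and the {1} placeholder should recreate each file's relative path under files/ in the local folder.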

Related

JFrog CLI - Unable to create nested folder while uploading package

I am uploading a NuGet package to JFrog Artifactory using GitHub Actions, setting up JFrog with this action.
For that purpose I tried the command below at first - it successfully uploaded the package, but to the wrong path...
jfrog rt u *.nupkg folder1/folder1.1/folder1.1.1/folder1.1.1.1/
It considered folder1/folder1.1/folder1.1.1/folder1.1.1.1/ as a single folder.
So after going through this answer, I tried setting its value to both true and false, but it didn't work and threw an error.
Any suggestion on how I can create a nested folder using JFrog CLI?
Well, the issue never existed.
The fun part is, JFrog doesn't create an empty directory.
When I tried creating Folder1.2, it displayed the nested hierarchy.
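For what it's worth, a quick way to see this: upload a file to the nested target and then search for it - the folder levels appear once content exists under them. A minimal sketch, reusing the layout from the question (mypackage.nupkg is a placeholder file name):
# upload into the nested path; the intermediate folders are created implicitly
jfrog rt u mypackage.nupkg folder1/folder1.1/folder1.1.1/folder1.1.1.1/
# list what now exists under folder1 (search is recursive by default)
jfrog rt s "folder1/*"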

Artifactory: Converting remote repo to local repo

My employer has been misusing Bintray as our binary repository for some time. We are finally moving to Artifactory instead and closing down Bintray. But this seems to be an almost impossible task. There is no way of exporting Bintray repos to a zip. Downloading the repos means manually downloading each file from the UI or through their API. I have tried two approaches for automation:
1) wget for crawling our bintray like this:
wget -e robots=off -o ~/wget.log -w 1 -m -np --user --password "https://.bintray.com"
which yielded all of the files in the repos. But this only solves half the problem. I couldn't find out how to import the files into a repository in Artifactory (all the repos are over 100 MB each and therefore can't be uploaded, for some reason).
2) I set the Bintray repos up as remote repositories and enabled Active Replication. That seems to have worked for now. But I don't know if they will be removed when the Bintray account is moved, or even whether they are stored in Artifactory. Therefore I would like to convert the remote repo to a local repo, to make sure that it is permanently stored in Artifactory. Is there a way of doing this? If so, how?
I'll try to address both of your questions below.
What do you mean you can't upload more than 100 MB? Which version of Artifactory are you using? An on-prem or SaaS-based installation? How are you trying to upload your files to Artifactory? Have you tried to import the content by using the import feature of Artifactory? (Admin --> Import & Export --> Repository Import)
It sounds like you are using the UI for the upload, and if so you can configure the max upload size in Admin --> General Configuration page.
If you mean that you have all of the content from Bintray cached in your remote repository cache in Artifactory just use the "Copy" or "Move" option and move the content to a local repository. This will ensure that all of the content is stored locally.
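If you prefer to script the copy, the JFrog CLI copy command can do the same thing against the remote repository's cache (which is addressed with the -cache suffix). A sketch only - my-bintray-remote and my-local-repo are placeholder repository names:
# copy everything cached in the remote repo into a local repo, keeping the folder structure
jfrog rt cp "my-bintray-remote-cache/*" my-local-repo/ --flat=false --url=https://yourhost/artifactory --user=xxx --password=xxx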

Using JFrog Artifactory Deployer with a shared UNC path

We are using JFrog Artifactory with TFS 2017 and I am looking to use the JFrog Artifactory Deployer task with my build, uploading artifacts from a shared UNC path. While it works fine when uploading artifacts from the local file system, it doesn't work with a UNC path. I tried using a mapped drive but that didn't work either. Does anyone know a solution for this?
I am getting the following error:
2017-05-22T15:23:06.5911571Z running 'C:\agent_work\16\a\jfrog.exe' rt upload '\\myshared\drops\BuildName\BuildVersion\**\*.zip' 'ext-repo' --url=https://aritfactory/artifactory --user=******** --password=******** --props='build.number=2996783;build.name=ArtifactoryUpload'
2017-05-22T15:23:06.5911571Z
2017-05-22T15:23:06.8240199Z Pinging Artifactory...
2017-05-22T15:23:07.0369535Z Done pinging Artifactory.
2017-05-22T15:23:07.0369535Z Path does not exist: \\myshared\drops\buildName\BuildVersion
2017-05-22T15:23:07.0838234Z ##[error]Microsoft.PowerShell.Commands.WriteErrorException: Deployment to Artifactory failed
2017-05-22T15:23:07.0994475Z ##[error]PowerShell script completed with 1 errors.
2017-05-22T15:23:07.0994475Z ##[section]Finishing: JFrogArtifactoryDeployer
You can copy the files to a local folder by using the Copy Files or Windows Machine File Copy task, then upload the artifacts.
Another workaround is adding a PowerShell script task in your build definition to map the network drive and then publish the artifacts. I just did a quick test and it works. The PowerShell script I used:
New-PSDrive -Name "G" -PSProvider "FileSystem" -Root "\\UNC\Path"
cd G:\
./jfrog.exe rt upload folder\\file.txt 'example-repo-local' --url=https://xxxxx.jfrog.io/xxxx/example-repo-local/ --user='xxxxx' --password='xxxxx' --props='build.number=001;build.name=BuildName'
Remember to download the "jfrog.exe" and place it in the UNC Path.
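If New-PSDrive is problematic on your agent, mapping the share with net use is another option. A sketch only - the drive letter, share path, repository and credentials are placeholders:
# map the share for the duration of the build step
net use G: \\UNC\Path /persistent:no
cd G:\
.\jfrog.exe rt upload folder\\file.txt example-repo-local --url=https://xxxxx.jfrog.io/xxxx --user=xxxxx --password=xxxxx
# clean up the mapping afterwards
net use G: /delete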
I think the easiest solution would be to make a symbolic link to the data folder, copy the contents of the data folder into the symbolic link, and then restart Artifactory.

How do I restore phabricator if I deleted the files but the database is still intact?

So, I did a stupid rm -rf on the folder where the complete Phabricator installation was present.
The whole Phabricator database is still intact, though.
I cloned the required repos on the same old location:
somewhere/ $ git clone https://github.com/phacility/libphutil.git
somewhere/ $ git clone https://github.com/phacility/arcanist.git
somewhere/ $ git clone https://github.com/phacility/phabricator.git
Apache was already configured during previous install.
I then ran:
./bin/storage upgrade
After that, I went to the address which points to the phabricator folder. Now I get the following error:
1146: Table 'phabricator_user.user_cache' doesn't exist
How do I resolve it? Or in general, what's the best way to reinstall phabricator using the old database?
Thanks
Well, if you still have the database, make a mysqldump of the data (export the DB data - you should have this by default anyway: a cron job running a backup script to another backup machine/USB/hard drive/cloud).
Do a fresh reinstall of Phabricator (even of the whole LAMP stack).
Import the backup.sql you created previously.
After setting the user/password/host/port in "path_to_phab/conf/local/local.json", via the command line or by simply editing the file, try to run:
./bin/storage upgrade
This should work fine if you have the file storage engine set to the MySQL DB (not recommended). If you have a different storage engine (like the local disk), try to restore the data by reproducing the path to where you have the data in the fresh installation's conf files, along with the MySQL import.
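For illustration, the dump-and-import described above could look roughly like this. This is only a sketch, assuming the default phabricator_* database names and MySQL root access - adjust credentials, host and file names to your setup:
# dump every phabricator_* database into a single file
mysqldump -u root -p --databases $(mysql -u root -p -N -e "SHOW DATABASES LIKE 'phabricator\_%'") > phabricator-backup.sql
# import the dump into the MySQL server behind the fresh installation
mysql -u root -p < phabricator-backup.sql
# then bring the schema up to date from the phabricator directory
./bin/storage upgrade
Phabricator also ships its own ./bin/storage dump command, which is a convenient way to produce such a dump while the install is still runnable.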

Is there a URL for the latest snapshot for an artifact in Artifactory?

I would like to make a permalink to the latest snapshot version of an artifact in Artifactory. If we are on 1.0-SNAPSHOT, I would like a URL that downloads the latest 1.0-SNAPSHOT JAR. I can find the latest artifact by locating the artifact on our server at http://hostname/artifactory/libs-snapshot/groupId/artifactId/1.0-SNAPSHOT/. Other than checking the timestamps, I can figure out which one is the latest by opening maven-metadata.xml and matching metadata/versioning/snapshot timestamp and buildNumber with a JAR in the same directory. This could be scripted, but ideally Artifactory already has a way to construct a permalink in this manner. Does Artifactory provide such a URL?
Doing the normal query for the entry with artifactId-1.0-SNAPSHOT.jar in the URL should automatically return the latest snapshot.
See the doc here
One thing: this is based either on the latest creation date if no POM is present, or on the latest creation of the POM if there is one. Mixing POM and non-POM deployments may lead to strange results!
I tried using a shell script and it worked for me.
Step 1: Get an encrypted password for your user account by clicking on your user name, or create a common user. See the "Using Your Secure Password" section in the following link:
http://www.jfrog.com/confluence/display/RTF/Centrally+Secure+Passwords
Step 2: On your local machine, create a temp folder and run this curl command (maybe wget on Windows):
curl -o tmp/foo.jar --user <username>:<encrypted_password> <artifactory_url>/list/libs-snapshot-local/com/search/foo/1.0/foo-1.0-SNAPSHOT.JAR
The foo.jar in the tmp folder is the latest version. If we don't give a timestamp, as above, it downloads the latest artifact for that version. Hope this helps!
This might be helpful:
How to download the latest artifact from Artifactory repository?
Although there is no permalink ability in the free version of Artifactory, it can be scripted easily as you suggest. I have provided a quick script to do that in the referenced question.
Hope it helps.
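For illustration, one way such a script can resolve the timestamped snapshot is Artifactory's latestVersion search REST API. A sketch only - host, repository, groupId/artifactId and credentials are placeholders matching the question's example URL:
# ask Artifactory for the latest resolved snapshot version (e.g. 1.0-20170522.152307-4)
VERSION=$(curl -s -u username:password "http://hostname/artifactory/api/search/latestVersion?g=groupId&a=artifactId&v=1.0-SNAPSHOT&repos=libs-snapshot")
# download the matching jar
curl -O -u username:password "http://hostname/artifactory/libs-snapshot/groupId/artifactId/1.0-SNAPSHOT/artifactId-${VERSION}.jar"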
Another portable option is to use the maven command line:
mvn org.apache.maven.plugins:maven-dependency-plugin:2.4:get -DartifactId=[artifactId] -DgroupId=[groupId] -Dversion=[version] -Ddest=[dest file]
This works for me (no search API, just direct artifact URL):
curl -O -J --user <username>:<encrypted_password> http://hostname/artifactory/libs-snapshot/groupId/artifactId/1.0-SNAPSHOT/artifactId-1.0-SNAPSHOT.jar
Basically, using 1.0-SNAPSHOT in the artifact file name downloads the latest version of the 1.0-SNAPSHOT snapshot.
