Does the Artifactory home directory need to be world-readable?

We operate Artifactory on a RHEL 7 Linux server. It runs as the local user artifactory, whose home directory is /var/opt/jfrog/artifactory with mode 755. Our security folks would prefer it to be 750. Does this directory need to be world-readable, or can we restrict it?

I installed Artifactory and changed the home directory and all of its subdirectories to mode 750 ("chmod 750"). After some testing, I did not encounter any issues with this change: Artifactory started successfully, and basic operations such as creating repositories, changing configuration, uploading and downloading artifacts, and creating users and permissions all worked. It seems you can change the mode of the Artifactory home to 750 without any issues.
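For reference, a minimal sketch of the permission change described above, assuming the layout from the question (home at /var/opt/jfrog/artifactory, service user artifactory; the systemd unit name may differ per install):

    # Stop the service before touching permissions
    sudo systemctl stop artifactory

    # What the answer describes: 750 on the home directory and all subdirectories
    sudo chmod -R 750 /var/opt/jfrog/artifactory

    # Gentler alternative: strip only the "other" bits, leaving owner/group modes as-is
    # sudo chmod -R o-rwx /var/opt/jfrog/artifactory

    sudo systemctl start artifactory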

Related

Artifactory to Artifactory remote repos

I have two Artifactory servers with Ubuntu repos configured.
One of the Artifactorys goes out to the internet to Ubuntu, and the other
Artifactory connects to the Artifactory with internet access.
I have the following problem: from my local Artifactory I always get a 404 error.
I can't fetch the metadata file (Packages) from the Ubuntu repo.
But if I reconfigure my remote repo and enable "store artifacts locally", all seems fine.
I want to store the artifacts locally: my local Artifactory should ask the Artifactory with internet access and get all files from that remote Artifactory.
Does anyone have an idea how to solve my problem?
Best regards
I assume you are trying to set up a smart remote repository with Artifactory. Refer to this wiki to set up the smart remote repository: basically, you should add the URL in your local Artifactory's remote repository as http://ARTIFACTORY_URL/ubuntu-remote/ and make sure "store artifacts locally" is checked so that this remote repository can index the artifacts.
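If you would rather script this than use the UI, a hedged sketch against the repository configuration REST API (repository key, credentials, and the URLs below are placeholders, not values taken from the answer):

    # Create a Debian remote repository on the internal Artifactory that proxies the
    # internet-facing instance; storeArtifactsLocally makes it cache what it fetches.
    curl -u admin:password -X PUT \
      "http://INTERNAL_ARTIFACTORY/artifactory/api/repositories/ubuntu-remote" \
      -H "Content-Type: application/json" \
      -d '{
            "rclass": "remote",
            "packageType": "debian",
            "url": "http://ARTIFACTORY_URL/artifactory/ubuntu-remote/",
            "storeArtifactsLocally": true
          }'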

Migrated JFrog from on-premise to AWS Cloud, but my artifacts are showing only 40, whereas we used to have 200k+ artifacts

I am trying to migrate our on-premise JFrog Artifactory to AWS cloud. I have followed the "Method-1" instructions.
I have also mounted a file system, moved the filestore content to that location, and set it as the home directory.
But I can still only see 40 artifacts. I'm not sure what is causing the issue.
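For context, the filestore move described above would look roughly like this; device names, mount points, and paths are assumptions, and note that the filestore holds only the binaries, while artifact metadata lives in the database:

    # Mount the new volume that will hold the Artifactory home (device name is an assumption)
    sudo mount /dev/xvdf /mnt/artifactory

    # Copy the binary filestore from the old home; binaries live under data/filestore
    sudo rsync -a /var/opt/jfrog/artifactory/data/filestore/ /mnt/artifactory/data/filestore/

    # Point the new instance at the migrated home before starting it
    export ARTIFACTORY_HOME=/mnt/artifactory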

Is there any way to manually cache remote artifacts in Artifactory?

I am playing with a brand new OSS Artifactory installation. I have a remote repository for jcenter. I am looking for a way to manually cache remote artifacts to jcenter-cache with the Artifactory web UI. I appreciate any help.
When your build resolves artifacts from Artifactory, Artifactory will cache artifacts that were requested from remote repositories.
But since you are looking for a way to do this manually with the Artifactory Web UI, it can be done, even though it is a little bit tricky. (NB: the Remote Search capability must be enabled in your Artifactory instance.) Here are the steps:
Click the Artifacts tab
Select Remote Search (this will open a window to search for artifacts in Bintray's remote jcenter repository)
Find the artifact you want to cache
When you find the artifact, click it; a pop-up will appear, then select Download.
This will download the file to your local machine and at the same time trigger Artifactory to cache the same artifact in the remote cache.
NB: Be advised that caching artifacts may take a while depending on the file size and internet connection speed.
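If the UI route feels too fiddly, a single request through the remote repository has the same effect, since any GET via a remote repo makes Artifactory fetch and cache the artifact server-side. A sketch with placeholder host, credentials, and artifact path:

    # Requesting an artifact through the jcenter remote repo triggers server-side caching
    # (the file lands in jcenter-cache); discard the local copy if you only want the cache.
    curl -u user:password -o /dev/null \
      "http://ARTIFACTORY_URL/artifactory/jcenter/junit/junit/4.12/junit-4.12.jar"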

deploying webapps (e.g. Moodle) on Heroku

What is the proper way to deploy webapps on Heroku? I'm installing Moodle, but the same procedure should apply to e.g. Drupal or WordPress. What I have done is unzip Moodle locally, then upload it to Heroku using git. When I then visit my site I get the option to install it and select the database, which works fine. The problem is that the install procedure saves information in the filesystem on the server, which gets overwritten the next time I deploy my app. So what is the proper way of doing this?
You have to pre-configure your app with all of the database settings before you deploy to Heroku. So either do a fake "install" in your local environment, or manually edit your PHP config files.
As you've discovered, Heroku's filesystem is not persistent: https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem.
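A minimal sketch of that flow, assuming Moodle's config.php has already been filled in locally and that the database lives in an add-on or on an external host (app name is a placeholder):

    # Unpack Moodle and turn it into a git repo; config.php must already contain the
    # database settings, since nothing written at runtime survives a dyno restart.
    unzip moodle-latest.zip && cd moodle
    git init && git add . && git commit -m "Moodle with pre-baked config.php"

    # Create the Heroku app and deploy
    heroku create my-moodle-app
    git push heroku master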

syncing local development branch with remote branch

How do people usually manage remote stuff alongside local stuff? Say I have an EC2 instance running Ubuntu, my dev machine is a Mac running OS X, and I have a Symfony2 project on the instance. Do people usually work with files directly on the remote server on EC2? If so, how do they use a text editor such as Sublime Text on an EC2 box?
I cannot say how "people" work, but to me the best practice is the following:
Remote:
Install MySQL & PHP, including all extensions that are needed (symfony-project/web/config.php)
You could use this tutorial: http://www.howtoforge.com/perfect-server-ubuntu-10.04-lucid-lynx-ispconfig-3
Create the database for your project on the server
Clone your git repository on the server and define webhook URLs
Create a pull hook file on your server: <?php shell_exec('git pull'); ?>
You'll never work on the remote directly from now on! (except installing vendors via ssh)
Local:
Install a local webserver. I recommend http://php-osx.liip.ch/ but you can also use MAMP (don't like that).
Clone the git repository into your web folder
Point your Symfony project at the remote database
See changes locally
Commit & push, and the webhook will make the remote pull automatically (see the sketch below)
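Putting the server-side half of this together, a rough sketch (web root, repo URL, and hook filename are assumptions):

    # On the EC2 box: clone the project into the web root
    cd /var/www
    git clone git@github.com:you/symfony-project.git

    # Drop the pull hook that your git host's webhook URL will call after each push
    echo "<?php shell_exec('cd /var/www/symfony-project && git pull'); ?>" \
      > /var/www/symfony-project/web/deploy.php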
