Cannot collect patch dependency on a local Artifactory PyPI repository - artifactory

While testing out conan, I had to "pip install" it.
As I am working in a fully offline environment, my expectation was that I could simply:
1. Manually deploy all dependencies listed in https://github.com/conan-io/conan/blob/master/conans/requirements.txt to a local PyPI repository called myrepo-python
2. Install conan with:
pip3 install --index http://myserver/artifactory/api/pypi/myrepo-python/simple conan
This works fine for some packages and then fails for the dependency on patch == 1.16
[...]
Collecting patch==1.16 (from conan)
Could not find a version that satisfies the requirement patch==1.16 (from conan) (from versions: )
No matching distribution found for patch==1.16 (from conan)
Looking into the Artifactory logs, this shows that even though I manually deployed patch-1.16.zip (from https://pypi.org/project/patch/1.16/#files) into the repository, it is not present in the index...
The .pypi/simple.html file doesn't contain an entry for 'patch' (checked from the Artifactory UI)
The server logs ($ARTIFACTORY_HOME/logs/artifactory.log) show the file being added to the indexing queue but don't contain a line saying that it got indexed
Does anyone know why patch-1.16.zip is not indexed by Artifactory?
This is on Artifactory 5.8.4.
For now, my only workaround is to gather all the dependencies into a local path and point pip3 at it
scp conan-1.4.4.tar.gz installserver:/tmp/pip_cache
[...]
scp patch-1.16.zip installserver:/tmp/pip_cache
[...]
scp pyparsing-2.2.0-py2.py3-none-any.whl installserver:/tmp/pip_cache
ssh installserver
installserver$ pip3 install --no-index --find-links="/tmp/pip_cache" conan

The reason you are unable to install the "patch" PyPI package via Artifactory is that it does not comply with the Python packaging spec.
According to the spec (https://www.python.org/dev/peps/pep-0314/ and https://packaging.python.org/tutorials/packaging-projects/), the structure of a Python package should be, for example:
└── patch-1.16.zip
    └── patch-1.16
        ├── PKG-INFO
        ├── __main__.py
        ├── patch.py
        └── setup.py
However, the zip archive (which can be found here: https://pypi.org/project/patch/1.16/#files) is structured like this:
└── patch-1.16.zip
    ├── PKG-INFO
    ├── __main__.py
    ├── patch.py
    └── setup.py
Artifactory searches for the metadata file (PKG-INFO in this case) in a certain pattern (inside any folder). Since the PKG-INFO is in the root of the zip (and not in a folder), it cannot find it. Therefore, the package's metadata is not calculated and the package does not appear in the "simple" index file (see the error in artifactory.log). As a result, you are unable to install it with pip.
Workaround:
What you can do is manually change the structure to the correct one.
Create a folder named patch-1.16 and extract the zip into it. Then zip the whole folder, so you get the structure shown in the example above. Finally, deploy this zip to Artifactory.
This time the PKG-INFO file will be found, the metadata will be calculated, and pip will be able to install it.

Related

Creating Docker Volumes in R-Studio Container

I have been running a Docker image that lets you interact with RStudio in the browser. Here is the link to the R-Studio Browser GitHub. I ran this on Windows 10. Below are the runtime parameters I used to start the container:
docker run --rm -p 8787:8787 -e USER=guest -e PASSWORD=guest -v C:\Users\Edge\Desktop\coding\school-courses\Probiblity-Stats\R-code\running:/home/guest/r-docker tmlts/r-tidyverse-h2o
I want the scripts that I create within the RStudio container to also be available on my host machine. So I would like to create a volume so that the class-work directory within the container maps one-to-one to the \Probability-Stats\R-code\running directory on my host machine. The testing.R file appears in the directories below as an example. Every time a change is implemented and saved, I want the file's contents on the host and in the container to be altered synchronously.
So within R studio container I have the following directories:
Note that the directories r-docker and rstudio came with the image build and class-work directory was manually created by me.
.
├── class-work/
│   └── testing.R
├── r-docker
└── rstudio
Host machine Directory:
.
└── Probability-Stats/
    └── R-code/
        └── running/
            └── testing.R
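As a sketch of what such a mount might look like: assuming class-work lives at /home/guest/class-work inside the container (a guess based on the tree above, not confirmed by the image docs), the run command could bind the host folder directly onto it, replacing the r-docker mount from the question:

```shell
# Hypothetical: bind-mount the host 'running' folder onto the container's
# class-work directory. The container-side path is an assumption.
docker run --rm -p 8787:8787 -e USER=guest -e PASSWORD=guest -v C:\Users\Edge\Desktop\coding\school-courses\Probiblity-Stats\R-code\running:/home/guest/class-work tmlts/r-tidyverse-h2o
```

With a bind mount, edits saved on either side are the same files on disk, so no separate synchronization step is needed.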

Deploying firebase functions with local dependencies using firebase CLI

Setup
I have a monorepo setup with the following file structure:
├── functions
│   ├── src
│   └── package.json
├── shared
│   ├── dist
│   ├── src
│   └── package.json
├── frontend
│   └── ...
└── firebase.json
Approach 1 (failed)
./shared holds TypeScript classes shared among the backend (./functions) and ./frontend. Ideally, I want to reference the shared lib from functions/package.json using a symlink, to avoid having to re-install after every change to my shared code (where most of the functionality resides).
However, this does not work (neither using link:, nor an absolute file: path, nor a relative file: path):
// functions/package.json
...
"dependencies": {
"shared": "file:/home/boern/Desktop/wd/monorepo/shared"
...
}
This results in an error upon firebase deploy --only functions (error Package "shared" refers to a non-existing file '"home/boern/Desktop/wd/monorepo/shared"'). The library (despite being present in ./functions/node_modules/) was apparently not transferred to the server.
Approach 2 (failed)
Also, setting "functions": {"ignore": []} in firebase.json did not help.
Approach 4 (works, but lacks requirement a; see Goal)
The only thing that DID work was a proposal by adevine on GitHub:
// functions/package.json
...
"scripts": {
  ...
  "preinstall": "if [ -d ../shared ]; then npm pack ../shared; fi"
},
"dependencies": {
  "shared": "file:./bbshared-1.0.0.tgz"
  ...
}
Goal
Can someone point out a way to reference a local library in a way that a) ./functions always uses an up-to-date version during development and b) deployment using the stock Firebase CLI succeeds (and not, e.g. using firelink)? Or is this simply not supported yet?
Here's my workaround to make approach 4 work. Run this from the ./functions folder:
rm -rf ./node_modules
yarn cache clean # THIS IS IMPORTANT
yarn install

Flyway Locations Property Not Being Set

Flyway does not seem to be able to resolve the classpath location, despite the migrations being in there. What am I doing wrong here?
➜ my-project git:(main) ✗ flyway migrate
Flyway Community Edition 7.0.4 by Redgate
ERROR: Unable to resolve location classpath:db/migration.
Database: jdbc:mysql://localhost:3306/adb (MySQL 8.0)
Successfully validated 0 migrations (execution time 00:00.051s)
WARNING: No migrations found. Are your locations set up correctly?
Current version of schema `adb`: << Empty Schema >>
Schema `adb` is up to date. No migration necessary.
flyway.conf
flyway.url=jdbc:mysql://localhost:3306/adb
flyway.user=root
flyway.password=my-secret-pw
flyway.locations=db/migration/
Tree
➜ my-project git:(main) ✗ tree .
.
├── README.md
├── db
│   └── migration
│       ├── V1.0__create_foo_table.sql
│       ├── V2.0__create_bar_table.sql
│       └── V3.0__alter_bar_table.sql
├── flyway.conf
I've also tried an absolute path with no luck.
Figured it out - I forgot the filesystem: prefix:
flyway.locations=filesystem:db/migration/
I experienced another reason for migration scripts not being found, with Flyway 7.8.2.
If the Flyway process does not have read access to a script then, rather than printing an error, it simply behaves as if the file does not exist.
I found this while using the Flyway Docker image within AWS CodePipeline. The image runs the process as a user called 'flyway', and CodePipeline somehow strips the "excess" permissions from the files on upload, so I needed to chmod 0444 the files before running Flyway.
I also recommend using Flyway's -X option to get debug output listing what it is doing.
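A minimal sketch of the permission fix, using the file names from the question's tree (a dummy migration is created so the commands are self-contained; the actual Flyway run is left as a comment since it needs a database):

```shell
# Stand-in migration file, named as in the question's tree:
mkdir -p db/migration
printf 'CREATE TABLE foo (id INT);\n' > db/migration/V1.0__create_foo_table.sql

# Make the scripts readable by any user, including the 'flyway' user
# inside the Docker image:
chmod 0444 db/migration/*.sql
ls -l db/migration

# Then run with debug output to see which files Flyway actually found:
# flyway -X migrate
```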

Symfony 3.4 deployer fails due to permission denied of shared folder

I have developed a webapp based on Symfony 3.4. In production it is deployed to an Ubuntu 18.04 server via Deployer (deployer.org).
Everything runs fine so far. The webapp is deployed in /opt/app/prod by a user that is part of the group www-data.
My webapp allows the upload of files. To support this I have added the folder data which stores the uploaded files.
In order to sustain access to the files after another release I have added the data folder to the list of shared folders.
My deploy.php looks as follows:
set('bin_dir', 'bin');
// Symfony console bin
set('bin/console', function () {
    return sprintf('{{release_path}}/%s/console', trim(get('bin_dir'), '/'));
});
// Project name
set('application', 'appname');
set('http_user', 'www-data');
set('writable_mode', 'acl');
// Project repository
set('repository', '<MY_GITREPO>');
// [Optional] Allocate tty for git clone. Default value is false.
set('git_tty', true);
// Shared files/dirs between deploys
add('shared_files', []);
add('shared_dirs', ['data']);
// Writable dirs by web server
add('writable_dirs', ['{{release_path}}','data']);
// Hosts
host('prod')
    ->hostname('<MY_HOST>')
    ->user('<MY_USER>')
    ->stage('prod')
    ->set('deploy_path', '/opt/app/prod/<MY_APPNAME>');
This leads to the following folder structure:
.
├── current -> releases/5
├── releases
│   ├── 2
│   ├── 3
│   ├── 4
│   └── 5
└── shared
    ├── app
    └── data
So everything is fine so far - with one exception:
Deployer wants to setfacl the data folder, which is not allowed because the files in data belong to www-data:www-data and the deploying user is not permitted to change their ACLs:
The command "export SYMFONY_ENV='prod'; cd /opt/app/prod/<MY_APPNAME>/releases/5 && (setfacl -RL -m u:"www-data":rwX -m u:`whoami`:rwX /opt/app/prod/<MY_APPNAME>/releases/5)" failed.
setfacl: /opt/app/prod/<MY_APPNAME>/releases/5/data/child/679/ba7f9641061879554e5cafbd6a3a557b.jpeg: Operation not permitted
I have the impression that I made a mistake in my deploy.php or missed something.
Does someone have an idea what I need to do in order to get my deployment running?
Thanks and best regards

How to install flyway DB migration tool in CentOS?

I am trying to install Flyway on a CentOS machine.
I have downloaded the Flyway command-line tar file and extracted it.
I tried to execute some Flyway commands, but they don't work; it says "-bash: flyway: command not found".
Did I miss anything? Do I have to install it?
I didn't find any tutorials for installation.
No need to install it; it's simply a shell script with a JRE, the Flyway Java libraries and associated resources.
It sounds like you need to add the location of the flyway shell script to your PATH variable if you want to run it without being in that directory or specifying the full path.
e.g.
If you have extracted flyway-commandline-4.1.2-linux-x64.tar.gz to /opt/flyway/flyway-4.1.2 which looks like:
flyway-4.1.2
├── conf
├── flyway # <---- The shell script
├── lib
└── ...
Somewhere in your shell setup you want that directory on your PATH:
export PATH=$PATH:/opt/flyway/flyway-4.1.2
Note the command-line documentation mentions the first two steps as:
Download the tool and extract it
cd into the extracted directory
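The PATH mechanism above can be sketched as follows. A stand-in flyway script is used here in place of the real extracted directory, so the commands are runnable anywhere; with the real tarball you would point PATH at /opt/flyway/flyway-4.1.2 as shown earlier:

```shell
# Stand-in for the extracted Flyway directory and its shell script:
mkdir -p /tmp/flyway-4.1.2
printf '#!/bin/sh\necho "Flyway Community Edition 4.1.2"\n' > /tmp/flyway-4.1.2/flyway
chmod +x /tmp/flyway-4.1.2/flyway

# Put the directory on PATH (add this line to ~/.bashrc to make it persist):
export PATH="$PATH:/tmp/flyway-4.1.2"

# 'flyway' now resolves from any directory instead of "command not found"
flyway
```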
