Flyway Locations Property Not Being Set - flyway

Flyway does not seem to be able to resolve the classpath location, despite the migrations being in there. What am I doing wrong here?
➜ my-project git:(main) ✗ flyway migrate
Flyway Community Edition 7.0.4 by Redgate
ERROR: Unable to resolve location classpath:db/migration.
Database: jdbc:mysql://localhost:3306/adb (MySQL 8.0)
Successfully validated 0 migrations (execution time 00:00.051s)
WARNING: No migrations found. Are your locations set up correctly?
Current version of schema `adb`: << Empty Schema >>
Schema `adb` is up to date. No migration necessary.
flyway.conf
flyway.url=jdbc:mysql://localhost:3306/adb
flyway.user=root
flyway.password=my-secret-pw
flyway.locations=db/migration/
Tree
➜ my-project git:(main) ✗ tree .
.
├── README.md
├── db
│   └── migration
│       ├── V1.0__create_foo_table.sql
│       ├── V2.0__create_bar_table.sql
│       └── V3.0__alter_bar_table.sql
├── flyway.conf
I've also tried an absolute path with no luck

Figured it out - I forgot the filesystem: prefix
flyway.locations=filesystem:db/migration/

I experienced another reason for migration scripts not being found with Flyway 7.8.2.
If the Flyway process does not have read access to the scripts, rather than printing an error, it simply behaves as if the files do not exist.
I found this while using the Flyway Docker image within AWS CodePipeline. The image runs the process as a user called 'flyway', and CodePipeline somehow strips the "excess" permissions from the files on upload, so I needed to "chmod 0444" the files before running Flyway.
I also recommend using the -X Flyway option to list what it is doing.
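As a rough sketch of that fix (the ./sql path and the official flyway/flyway image mount points are assumptions here, not the exact pipeline setup):
# make the migration scripts world-readable so the container's 'flyway' user can read them
chmod 0444 sql/*.sql
# -X prints debug output, including which locations were scanned and which files were found
docker run --rm -v "$(pwd)/sql:/flyway/sql" -v "$(pwd)/flyway.conf:/flyway/conf/flyway.conf" flyway/flyway -X migrate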

Related

Creating Docker Volumes in R-Studio Container

I have been running a docker image that lets you interact with R-Studio in the browser. Here is the link to the R-Studio Browser Github. I ran this on Windows 10. Below are the runtime parameters I used to start the container:
docker run --rm -p 8787:8787 -e USER=guest -e PASSWORD=guest -v C:\Users\Edge\Desktop\coding\school-courses\Probiblity-Stats\R-code\running:/home/guest/r-docker tmlts/r-tidyverse-h2o
I want the scripts that I create within the R-Studio container to also be available on my host machine. So I would like to create a volume so that all the files in the class-work directory within the container map one-to-one to the \Probability-Stats\R-code\running directory on my host machine (see the sketch after the directory listings below). The testing.R file in the directories below is an example. Every time a change is saved, I want the file's contents on the host machine and in the container to be updated synchronously.
So within R studio container I have the following directories:
Note that the directories r-docker and rstudio came with the image build and class-work directory was manually created by me.
.
├── class-work/
│   └── testing.R
├── r-docker
└── rstudio
Host machine Directory:
.
└── Probability-Stats/
    └── R-code/
        └── running/
            └── testing.R
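To illustrate, this is roughly the kind of bind mount I am after (a sketch only - it assumes class-work sits under /home/guest inside the container; the host path is copied from my original command):
docker run --rm -p 8787:8787 -e USER=guest -e PASSWORD=guest -v C:\Users\Edge\Desktop\coding\school-courses\Probiblity-Stats\R-code\running:/home/guest/class-work tmlts/r-tidyverse-h2o
Files saved in class-work inside the container should then appear in the running folder on the host, and vice versa.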

Symfony 3.4 deployer fails due to permission denied of shared folder

I have developed a webapp based on Symfony3.4. On production it is deployed on a Ubuntu 18.04 Server via deployer (deployer.org).
Everything runs fine so far. The webapp is deployed to /opt/app/prod by a user that is part of the group www-data.
My webapp allows the upload of files. To support this I have added the folder data which stores the uploaded files.
In order to sustain access to the files after another release I have added the data folder to the list of shared folders.
My deploy.php looks as follows:
set('bin_dir', 'bin');
// Symfony console bin
set('bin/console', function () {
    return sprintf('{{release_path}}/%s/console', trim(get('bin_dir'), '/'));
});
// Project name
set('application', 'appname');
set('http_user', 'www-data');
set('writable_mode', 'acl');
// Project repository
set('repository', '<MY_GITREPO>');
// [Optional] Allocate tty for git clone. Default value is false.
set('git_tty', true);
// Shared files/dirs between deploys
add('shared_files', []);
add('shared_dirs', ['data']);
// Writable dirs by web server
add('writable_dirs', ['{{release_path}}','data']);
// Hosts
host('prod')
    ->hostname('<MY_HOST>')
    ->user('<MY_USER>')
    ->stage('prod')
    ->set('deploy_path', '/opt/app/prod/<MY_APPNAME>');
This leads to the following folder structure:
.
├── current -> releases/5
├── releases
│   ├── 2
│   ├── 3
│   ├── 4
│   └── 5
└── shared
    ├── app
    └── data
So everything is fine so far - with one exception:
Deployer wants to setfacl the data folder, which is not allowed because the files in data belong to www-data:www-data, and deployer (running as my deploy user) is not permitted to change that.
The command "export SYMFONY_ENV='prod'; cd /opt/app/prod/<MY_APPNAME>/releases/5 && (setfacl -RL -m u:"www-data":rwX -m u:`whoami`:rwX /opt/app/prod/<MY_APPNAME>/releases/5)" failed.
setfacl: /opt/app/prod/<MY_APPNAME>/releases/5/data/child/679/ba7f9641061879554e5cafbd6a3a557b.jpeg: Operation not permitted
I have the impression that I made a mistake in my deploy.php or that I missed something.
Does someone have an idea what I need to do in order to get my deployment running?
Thanks and best regards

Cannot collect patch dependency on a local Artifactory Pypi repository

While testing out conan, I had to "pip install" it.
As I am working in a fully offline environment, my expectation was that I could simply:
1. Manually deploy all dependencies listed in https://github.com/conan-io/conan/blob/master/conans/requirements.txt to a local Pypi repository called myrepo-python
2. Install conan with
pip3 install --index http://myserver/artifactory/api/pypi/myrepo-python/simple conan
This works fine for some packages and then fails for the dependency on patch == 1.16
[...]
Collecting patch==1.16 (from conan)
Could not find a version that satisfies the requirement patch==1.16 (from conan) (from versions: )
No matching distribution found for patch==1.16 (from conan)
Looking into the Artifactory logs, this shows that even though I manually deployed patch-1.16.zip (from https://pypi.org/project/patch/1.16/#files) into the repository, it is not present in the index...
The .pypi/simple.html file doesn't contain an entry for 'patch' (checked from the Artifactory UI)
The server logs ($ARTIFACTORY_HOME/logs/artifactory.log) show the file being added to the indexing queue but don't contain a line saying that it got indexed
Does anyone know why patch-1.16.zip is not indexed by Artifactory?
This is on Artifactory 5.8.4.
For now, my only workaround is to gather all the dependencies into a local path and point pip3 at it
scp conan-1.4.4.tar.gz installserver:/tmp/pip_cache
[...]
scp patch-1.16.zip installserver:/tmp/pip_cache
[...]
scp pyparsing-2.2.0-py2.py3-none-any.whl installserver:/tmp/pip_cache
ssh installserver
installserver$ pip3 install --no-index --find-links="/tmp/pip_cache" conan
The reason you are unable to install the "patch" Pypi package via Artifactory is that it does not comply with the Python spec.
Based on Python spec (https://www.python.org/dev/peps/pep-0314/ and https://packaging.python.org/tutorials/packaging-projects/), the structure of a Python package should be, for example:
└── patch-1.16.zip
    └── patch-1.16
        ├── PKG-INFO
        ├── __main__.py
        ├── patch.py
        └── setup.py
However, the zip archive (which can be found here https://pypi.org/project/patch/1.16/#files) is structured like this:
└── patch-1.16.zip
    ├── PKG-INFO
    ├── __main__.py
    ├── patch.py
    └── setup.py
Artifactory searches for the metadata file (PKG-INFO in this case) in a certain pattern (inside any folder). Since PKG-INFO is in the root of the zip (and not inside a folder), Artifactory cannot find it; therefore, this package's metadata will not be calculated and it will not appear in the "simple" index file (see the error in artifactory.log). As a result, you are unable to install it with pip.
Workaround:
What you can do is manually change the structure to the correct one.
Create a folder named patch-1.16 and extract the zip into it. Then zip the whole folder, so you get the structure shown in the example above. Finally, deploy this zip to Artifactory.
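A minimal shell sketch of that repackaging (the file name is assumed to match the upstream download):
# create the wrapper folder and extract the original zip into it
mkdir patch-1.16
unzip patch-1.16.zip -d patch-1.16
# re-zip so the archive now contains a top-level patch-1.16/ directory
rm patch-1.16.zip
zip -r patch-1.16.zip patch-1.16
Then deploy the resulting patch-1.16.zip to the myrepo-python repository as before.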
This time, the PKG-INFO file will be found, the metadata will be calculated and pip will be able to install it.

Why is docker build prefixing my copy path with a temp directory?

Here is the Dockerfile.
FROM microsoft/aspnet:4.7
ARG source
WORKDIR /inetpub/wwwroot
COPY ${source:-obj/Docker/publish} .
And here is the error.
Error
Building a.enterpriseextservices
Service 'a.enterpriseextservices' failed to build: COPY failed:
GetFileAttributesEx \\?\C:\Users\jesmiller-AM\AppData\Local\Temp\docker-
builder587295999\obj\Docker\publish: The system cannot find the file specified..
For more troubleshooting information, go to
http://aka.ms/DockerToolsTroubleshooting docker-compose C:\Program Files
(x86)\Microsoft Visual Studio\2017\Community\MSBuild\Sdks\Microsoft.Docker.Sdk\build\Microsoft.VisualStudio.Docker.Compose.targets 349
I have published the project to the obj/Docker/publish folder.
Here is my docker-compose file. I used the docker-compose up command from the folder where the docker-compose.yml file is located.
version: '3'
services:
  a.web.familyconnection:
    image: a.web.familyconnection
    build:
      context: .\FamilyConnection
      dockerfile: Dockerfile
  b.enterpriseextservices:
    image: b.enterpriseextservices
    build:
      context: .\Framework\b.EnterpriseExtServices
      dockerfile: Dockerfile
I had the same issue. Turned out I made a silly mistake. I added the following to my .dockerignore file, just out of habit when setting up a new project:
bin
obj
.vs
.git
Then I tried running this in my Dockerfile
COPY ./bin/publish/ .
Docker gave the strange tmp path error because the bin folder was excluded from the build context, so the path I told it to copy did not exist there. Once I copied from a different publish path (not under bin), the problem went away.
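Presumably you could also keep the ignore entries and add an exception for just the publish output, since .dockerignore supports ! exception patterns - I haven't verified this exact combination, though:
bin
!bin/publish
obj
.vs
.git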
It looks like your path to the folders, or where you've published your code, may be incorrect. The project should be published to the obj/Docker/publish folder inside the respective folders defined by context.
Using an example docker-compose.yml:
version: "3"
services:
  foo:
    build: ./foo
  bar:
    build: ./bar
And Dockerfile:
FROM jaydorsey/ruby-2.4.1
COPY ${source:-obj/Docker/publish} .
And a tree structure like this:
.
├── Dockerfile
├── bar
│   └── Dockerfile
├── docker-compose.yml
├── foo
│   └── Dockerfile
└── obj
    └── Docker
        └── publish
When I run docker-compose build I get the following error
Building foo
Step 1/2 : FROM jaydorsey/ruby-2.4.1
---> b79899b232f6
Step 2/2 : COPY ${source:-obj/Docker/publish} .
ERROR: Service 'foo' failed to build: COPY failed: stat /var/lib/docker/tmp/docker-builder186649811/obj/Docker/publish: no such file or directory
This isn't identical to yours, since I'm running macOS, but it's very similar. You'll note the temporary file location (which is an internal Docker artifact of how it copies files around) and the similarity in the docker-builder<randomstring> path.
However, if I create the obj/Docker/publish folders underneath each respective subfolder (context), the docker-compose build command works fine.
.
├── Dockerfile
├── bar
│   ├── Dockerfile
│   └── obj
│       └── Docker
│           └── publish
├── docker-compose.yml
├── foo
│   ├── Dockerfile
│   └── obj
│       └── Docker
│           └── publish
└── obj
    └── Docker
        └── publish
Please check that the folder you've published exists under the contexts as noted, and not in the root.
I still believe this is a path issue as noted in the error message. I hope this provides some context that helps you debug the root cause.
Can you please confirm your file & folder layout? I'm fairly certain it's path related because of the error message. I haven't done any Docker for Windows work either, but I'd also double-check that your paths use the correct slash (forward vs. backward).

How to install flyway DB migration tool in CentOS?

I am trying to install flyway on a centOS machine.
I have downloaded Flyway command line tar file and extracted it.
I tried to execute some flyway commands but they don't work;
it says "-bash: flyway: command not found".
Did I miss anything?
Do I have to install it?
I didn't find any tutorials for installation.
There is no need to install it; it's simply a shell script bundled with a JRE, the Flyway Java libraries and associated resources.
It sounds like you need to add the location of the flyway shell script to your PATH variable if you want to run it without being in that directory or specifying the full path.
e.g.
If you have extracted flyway-commandline-4.1.2-linux-x64.tar.gz to /opt/flyway/flyway-4.1.2 which looks like:
flyway-4.1.2
├── conf
├── flyway # <---- The shell script
├── lib
└── ...
Somewhere in your setup you want that directory on your PATH:
export PATH=$PATH:/opt/flyway/flyway-4.1.2
Note that the command line documentation mentions the first two steps as:
download the tool and extract it
cd into the extracted directory.
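Putting that together, a rough end-to-end sketch (the tarball name matches the version above; the install locations are just examples):
# extract the downloaded tarball to /opt/flyway, giving /opt/flyway/flyway-4.1.2
mkdir -p /opt/flyway
tar -xzf flyway-commandline-4.1.2-linux-x64.tar.gz -C /opt/flyway
# either put that directory on your PATH (add the line to ~/.bashrc to make it permanent) ...
export PATH=$PATH:/opt/flyway/flyway-4.1.2
# ... or symlink the launcher into a directory that is already on the PATH
sudo ln -s /opt/flyway/flyway-4.1.2/flyway /usr/local/bin/flyway
# the flyway command should now resolve from any directory
flyway -v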
