I have a Java Maven project, and I'm using Liquibase to update the DB.
Locally, to update my DB, I just run this on the command line:
mvn liquibase:update
In the production environment, I don't have Maven installed.
What I need is to execute a command from the console that runs the Liquibase scripts with a specific classpath.
Any ideas?
Edit:
OK, I'm trying to follow this approach. I put the following items in a folder:
liquibase jar
The war containing my application and the liquibase changesets
liquibase.properties: It contains the following:
url=jdbc:jtds:sqlserver://xxxxxxxx:xxxx/xxxxx
username=xxx
password=xxxxx
classpath=war_file.war
changeLogFile=WEB-INF/classes/sql/projectName/liquibase/liquibase.xml
Then, in a console, I execute:
java -jar liquibase-core-3.0.5.jar update
It works! It finds my liquibase.xml file and starts the liquibase update.
BUT, when it refers to a liquibase.xml that is inside another jar included in the lib folder, it fails, because I had included it in my liquibase.xml as:
<include file="../other_module/src/main/resources/sql/projectName/liquibase/liquibase.xml" />
How can I write this include without the "src/main/resources" prefix and make it find this XML?
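One approach that may work: since Liquibase resolves non-relative include paths against the classpath, the other module's jar can be added to the classpath in liquibase.properties and the changelog referenced by its classpath location (other_module.jar here is a placeholder for that module's actual jar name):
# liquibase.properties -- entries are separated by ';' on Windows, ':' on Linux
classpath=war_file.war;other_module.jar
and in liquibase.xml:
<include file="sql/projectName/liquibase/liquibase.xml" />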
By running the updateSQL goal on your dev machine:
mvn liquibase:updateSQL
you can generate a migration script:
└── target
└── liquibase
└── migrate.sql
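The resulting migrate.sql can then be reviewed and applied manually, for example with sqlcmd against the SQL Server instance (connection details are placeholders, matching the jtds URL above):
sqlcmd -S xxxxxxxx -d xxxxx -U xxx -P xxxxx -i target/liquibase/migrate.sql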
This is one of my favourite features of liquibase. Sometimes clients insist that all database schema changes must be performed manually by their staff.
Another option is to build an auto-upgrade capability into your application; see the Liquibase servlet listener.
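A minimal web.xml sketch of that approach, using the listener class and context parameters from the Liquibase servlet listener documentation (the changelog path and the JNDI datasource name are placeholders):
<context-param>
    <param-name>liquibase.changelog</param-name>
    <param-value>sql/projectName/liquibase/liquibase.xml</param-value>
</context-param>
<context-param>
    <param-name>liquibase.datasource</param-name>
    <param-value>java:comp/env/jdbc/myDataSource</param-value>
</context-param>
<listener>
    <!-- runs the pending changesets when the webapp starts -->
    <listener-class>liquibase.integration.servlet.LiquibaseServletListener</listener-class>
</listener>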
Flyway supports environment variables in config files.
Is there a way to make Flyway load these variables from a file, similarly to what Docker and Node.js with dotenv do?
The content of the .env file is for example:
DB_URL=jdbc:postgresql://localhost:5432/db_name
And flyway.conf:
flyway.url=${DB_URL}
If you are using flyway-maven-plugin, you currently have 3 ways:
Defining flyway properties in the pom.xml, e.g.:
<properties>
<flyway.url>jdbc:h2:mem:public;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE;MODE=MySQL;INIT=CREATE SCHEMA IF NOT EXISTS "public";</flyway.url>
<flyway.user>root</flyway.user>
<flyway.password></flyway.password>
</properties>
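With those properties in the pom.xml, the goal can then be run without extra flags:
mvn flyway:migrate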
Defining your flyway properties in some .env or a .conf file.
mvn -Dflyway.configFiles=src/main/resources/some-env-file.env flyway:migrate
Contents of some-env-file.env:
flyway.url=jdbc:h2:mem:public;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE;MODE=MySQL;INIT=CREATE SCHEMA IF NOT EXISTS "public";
flyway.user=root
flyway.password=
Injecting the environment variables directly during maven goal execution:
mvn -Dflyway.url="jdbc:h2:mem:public;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE;MODE=MySQL;INIT=CREATE SCHEMA IF NOT EXISTS public;" -Dflyway.user=root -Dflyway.password=root flyway:migrate
But if you want to load properties from a file using properties-maven-plugin and make them available as environment variables to be used by flyway-maven-plugin, then unfortunately that does not work.
Here is the github issue tracking this.
I need to create a self-contained .NET Core (3.1 in my case) app and pack & publish it using Chocolatey so it can be installed and used.
I'm using Azure DevOps and have a feed on my own where I'm supposed to publish the chocolatey package.
The objective is to do this in the build pipeline, so, among other tasks I have:
dotnet publish task: creates the self-contained executable
chocolatey pack task: creates the .nupkg from a very simplistic .nuspec (only mandatory fields) I created.
My current problem is that the .nupkg file created always contains the project files and not the generated executable.
To try to work around it, I even set the chocolatey pack task's working directory to the dotnet publish task's output directory.
What am I missing? Is there another approach?
It depends on whether you include the contained executable file in your .nuspec file.
If we include the contained executable in the .nuspec file, Chocolatey will create the .nupkg with the contained executable included, like:
<files>
<file src="IngestCanonicLtesConsole\ContainedExecutable.exe" target="Tools\ContainedExecutable.exe" />
</files>
This adds the contained executable file to the package.
So, if we only include the mandatory fields, without a <files> element listing the contained executable, it will not be included.
Besides, since we need to include the contained executable in the .nuspec file, we could change the output of the dotnet build to $(System.DefaultWorkingDirectory)\IngestCanonicLtesConsole, so that we can use a relative path in the .nuspec file.
Please check the document .nuspec reference for some more details.
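For reference, a minimal complete .nuspec along those lines might look like this (the id, version, authors, and description values are placeholders):
<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2015/06/nuspec.xsd">
  <metadata>
    <id>ingest-canonic-ltes</id>
    <version>1.0.0</version>
    <authors>your-team</authors>
    <description>Self-contained console tool.</description>
  </metadata>
  <files>
    <!-- copy the published executable into the package's tools folder -->
    <file src="IngestCanonicLtesConsole\ContainedExecutable.exe" target="tools\ContainedExecutable.exe" />
  </files>
</package>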
After a few tests I realized that chocolatey pack will "pack" all files that exist in the same folder as the .nuspec. I'm not sure whether this is because I don't set anything else on the task.
Basically, my solution was to copy my ".nuspec" file to the folder where my executable was.
I am trying to install flyway on a centOS machine.
I have downloaded Flyway command line tar file and extracted it.
I tried to execute some flyway commands, but they don't work;
it says "-bash: flyway: command not found".
Did I miss anything?
Do I have to install it?
I didn't find any tutorials for installation.
No need to install it; it's simply a shell script bundled with a JRE, the Flyway Java libraries, and associated resources.
Sounds like you need to add the location of the flyway shell script to your PATH variable if you want to run it without being in its directory or specifying the full path.
e.g.
If you have extracted flyway-commandline-4.1.2-linux-x64.tar.gz to /opt/flyway/flyway-4.1.2 which looks like:
flyway-4.1.2
├── conf
├── flyway # <---- The shell script
├── lib
└── ...
Somewhere in your setup you want that on your PATH:
export PATH=$PATH:/opt/flyway/flyway-4.1.2
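To make that persist across shell sessions, you could append it to your shell profile, e.g. (assuming bash and the install path above):
echo 'export PATH=$PATH:/opt/flyway/flyway-4.1.2' >> ~/.bashrc
source ~/.bashrc
flyway    # with no arguments this prints the usage text, confirming the script is found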
Note that the command-line documentation lists the first two steps as:
download the tool and extract it
cd into the extracted directory.
I am trying to place an updated jar under the lib path and remove the old jar. Unfortunately, I still see the old logs, which came from the old jar, in the Oozie console. For confidentiality reasons I am unable to show the logs here, but I am doing the following steps:
Replacing the jar (mycode.jar) under the lib folder which is mentioned in workflow.xml
Submitting the oozie job using oozie job -oozie http://host -config job.properties -run
When I look at the logs in the console, I can see the old jar's logs (from the older version of mycode.jar) even though the jar has been replaced.
If you are talking about the lib directory in the Oozie workflow application, then you don't need to do anything; the next execution of the workflow will automatically pick up the new (updated) jar.
If you are updating jars in the share lib (/user/oozie/share/lib/lib_*/*), then after replacing the jar you need to execute the following command to refresh the share lib on the Oozie server:
oozie admin -sharelibupdate
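If the OOZIE_URL environment variable is not set, the server URL can be passed explicitly, mirroring the job submission above (host and port are placeholders; 11000 is the default Oozie port):
oozie admin -oozie http://host:11000/oozie -sharelibupdate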
Hope this will help. Thanks.
To make sure the issue is the same, I'll narrate what I was facing:
Created a MapReduce JAR and placed it in the lib folder.
Ran the Oozie (MapReduce action) job; it picked up the JAR as expected and ran fine.
I had some functionality changes in my code (JAR), so I added new log statements to make sure the new JAR was being picked up. I built the JAR and replaced the old JAR with the newly built one in the lib folder (HDFS).
Ran the Oozie job again; code from the old JAR was executed, since the new log statements did not show up.
After some searching I found the following tips:
Clear the Yarn Cache: found this in HortonWorks site(https://community.hortonworks.com/articles/92339/how-to-clear-local-file-cache-and-user-cache-for-y.html) - pasting content below for reference
Short description:
To use a different version of a jar file with the same name, clear the cache on all NodeManager hosts to prevent the application from using the old jar.
a. Find out the cache location by checking the value of the yarn.nodemanager.local-dirs property
<property>
  <name>yarn.nodemanager.local-dirs</name>
  <value>/hadoop/yarn/local</value>
</property>
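One way to check the configured value on a node (the yarn-site.xml path shown is typical but may differ by distribution):
grep -A 1 'yarn.nodemanager.local-dirs' /etc/hadoop/conf/yarn-site.xml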
b. Remove the filecache and usercache folders located inside the directories specified by yarn.nodemanager.local-dirs.
[yarn@node2 ~]$ cd /hadoop/yarn/local/
[yarn@node2 local]$ ls
filecache  nmPrivate  spark_shuffle  usercache
[yarn@node2 local]$ rm -rf filecache/ usercache/
c. Restart YARN service.
I was unable to clear the cache because I did not have the necessary access, so I followed the workaround below:
Rename the package or class. Since this package/class was written by me, I had the liberty to simply rename the class, so when the new class name was looked up in Oozie, the new functionality was executed automatically.
Option 2 may not be viable for many, and the question remains open as to why Oozie does not pick up the new JAR/class.
Could someone please explain how one uses premake extensions? I added the eclipse extension in a directory under my premake installation, and in the premake script I added require "eclipse".
Running the script with premake5 eclipse, I get the error module "eclipse.lua" not found.
I added the path of the modules directory to my environment variables.
I'm using premake (premake5) on Windows 8.
Thanks
Add-ons need to reside in a folder. You need to create an "eclipse" folder, copy all the files into it, and the "eclipse" folder should be located where premake can load it (either next to the executable or some other place handled through environment variables).
I got this working by adding the full path to the require statement.
require "C:/premake/eclipse/eclipse"
and running the command as premake5 eclipse
Note: This plugin does not generate project files that one can import into Eclipse.
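For context, a minimal premake5.lua using such a module might look like this (the workspace/project names are placeholders):
-- premake5.lua
require "eclipse"            -- or a full path, e.g. "C:/premake/eclipse/eclipse"

workspace "MyWorkspace"
    configurations { "Debug", "Release" }

project "MyProject"
    kind "ConsoleApp"
    language "C++"
    files { "src/**.cpp" }
-- then generate with: premake5 eclipse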