Flyway output to SQL File

Is it possible to output the db migration to an SQL file instead of directly invoking database changes in flyway?

Most of the time this will not be needed, since with Flyway the DB migrations themselves are already written in SQL.

Yes, it's possible, and in my opinion the feature is a must for DBAs who don't want to allow Flyway in prod.
I made do with modifying code from the commit below, which adds a dry-run command to Flyway; you can add a FileWriter and write out migrationDetails:
https://github.com/killbill/killbill/commit/996a3d5fd096525689dced825eac7a95a8a7817e
I did it like so. Project structure (just copied out of killbill's project, with the package renamed to flywaydr):
.
./main
./main/java
./main/java/com
./main/java/com/flywaydr
./main/java/com/flywaydr/CapturingMetaDataTable.java
./main/java/com/flywaydr/CapturingSqlMigrationExecutor.java
./main/java/com/flywaydr/DbMigrateWithDryRun.java
./main/java/com/flywaydr/MigrationInfoCallback.java
./main/java/com/flywaydr/Migrator.java
./main/java/org
./main/java/org/flywaydb
./main/java/org/flywaydb/core
./main/java/org/flywaydb/core/FlywayWithDryRun.java
In Migrator.java, add the following (implement the callback and put it in DbMigrateWithDryRun.java):
} else if ("dryRunMigrate".equals(operation)) {
    MigrationInfoCallback mcb = new MigrationInfoCallback();
    flyway.dryRunMigrate();
    MigrationInfoImpl[] migrationDetails = mcb.getPendingMigrationDetails();
    if (migrationDetails.length > 0) {
        writeMasterScriptToFile(migrationDetails);
    }
}
Then, to write the output to a file, something like:
private static void writeMasterScriptToFile(MigrationInfoImpl[] migrationDetails) {
    FileWriter fw = null;
    try {
        String masterScriptLoc = "path/to/file";
        String scriptsPathLoc = "path/to/scripts/dir";
        fw = new FileWriter(masterScriptLoc);
        LOG.info("Writing output to " + masterScriptLoc);
        for (final MigrationInfoImpl migration : migrationDetails) {
            Path file = Paths.get(migration.getResolvedMigration().getPhysicalLocation());
            // if you want to copy the actual script files parsed by Flyway
            Files.copy(file, Paths.get(scriptsPathLoc, file.getFileName().toString()), REPLACE_EXISTING);
            // or just append each script's content to the master file
            fw.write("-- " + file.getFileName() + System.lineSeparator());
            fw.write(new String(Files.readAllBytes(file)));
            fw.write(System.lineSeparator());
        }
    } catch (Exception e) {
        LOG.error("Could not write to file, io exception was thrown.", e);
    } finally {
        if (fw != null) {
            try { fw.close(); } catch (Exception e) { LOG.error("Could not close file writer.", e); }
        }
    }
}
One last thing to mention: I compile and package this into a jar "with dependencies" (aka fat jar) via Maven (the assembly plugin with the jar-with-dependencies descriptor) and run it via a command like the one below. Alternatively, you can include it as a dependency and call it via the mvn exec:exec goal, which is something I had success with as well.
$ java -jar /path/to/flywaydr-fatjar.jar dryRunMigrate -regular.flyway.configs -etc -etc
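If all you need is the master-script idea itself, the core of it is plain file concatenation and needs nothing Flyway-specific. A minimal stdlib-only sketch (the class and method names here are mine, not Flyway's):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class MasterScriptWriter {
    // Concatenate migration scripts, in order, into a single master SQL file,
    // prefixing each with a comment naming its source script.
    public static void writeMasterScript(List<Path> migrations, Path master) throws IOException {
        StringBuilder sb = new StringBuilder();
        for (Path migration : migrations) {
            sb.append("-- ").append(migration.getFileName()).append(System.lineSeparator());
            sb.append(new String(Files.readAllBytes(migration))).append(System.lineSeparator());
        }
        Files.write(master, sb.toString().getBytes());
    }
}
```

Feed it the physical locations of the pending migrations (as obtained from getResolvedMigration().getPhysicalLocation() above) and hand the resulting file to your DBA.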

I didn't find a way. I switched to MyBatis Migrations instead; it looks quite nice.

Related

c++builder FDConnection to SQLite after compile the exe

I created a script with C++Builder 11 with data stored in a SQLite3 .db file.
To connect to the SQLite3 db, I used the FireDAC connection inside C++Builder and everything works fine.
In the Connection Manager (Data Explorer) of C++Builder I set the path to "D:\TEST.db".
If I share the compiled exe to another PC, it's possible there is no drive D.
I tried using only the filename "TEST.db" without a path in the Connection Manager, but that ends in a FireDAC error message: "FireDAC..Stan..Definition .. not found in []".
I also tried "#scriptdir & "\TEST.db"" in the Connection Manager, but that doesn't work either.
How can I keep my TEST.db (SQLite3 database) in the exe's directory if the connection is fixed via "FDConnection1"?
Thanks for any help.
Greetings.
You could use the BeforeConnect event to point your FDConnection object to the correct database file. In the example below it points to a database file named "test.db3" located in your exe's folder:
void __fastcall TForm4::FDConnection1BeforeConnect(TObject *Sender)
{
    dynamic_cast<TFDConnection *>(Sender)->Params->Database =
        ExtractFileDir(Application->ExeName) + "\\test.db3";
}

void __fastcall TForm4::Button1Click(TObject *Sender)
{
    FDConnection1->Connected = true;
}
best regards, Herwig

Error Connect Database Sqlite in Mule Connector

I want to connect to and select from a SQLite database in Mule Anypoint Studio, but I get an error. Please help me. Thanks all.
No suitable driver found for jdbc:sqlite
Here is my code:
@Processor(name = "select", friendlyName = "select")
public void select() {
    ArrayList<Story> list = new ArrayList<Story>();
    String sql = "select * from chat";
    try (Connection conn = this.connect();
         Statement stmt = conn.createStatement();
         ResultSet rs = stmt.executeQuery(sql)) {
        // loop through the result set
        while (rs.next()) {
            Story s = new Story();
            s.setStory(rs.getInt("id"), rs.getString("user_chat"), rs.getString("bot_chat"));
            list.add(s);
        }
    } catch (SQLException | ClassNotFoundException e) {
        System.out.println(e.getMessage());
    }
    for (int i = 0; i < list.size(); i++) {
        System.out.print(list.get(i).GetID() + "| " + list.get(i).GetUserChat() + "| " + list.get(i).GetBotChat() + "\n");
    }
}

private Connection connect() throws ClassNotFoundException {
    // SQLite connection string
    Class.forName("org.sqlite.JDBC");
    String url = "jdbc:sqlite:C:\\data.db";
    Connection conn = null;
    try {
        conn = DriverManager.getConnection(url);
    } catch (SQLException e) {
        System.out.println(e.getMessage());
    }
    return conn;
}
}
Make sure you have a valid jar/driver on your project classpath.
Open a new Mule project in Studio, and then follow these steps to add/create a datasource in the Mule flow:
a. Import the driver
b. Create a datasource
c. Create a connector that uses the datasource, and finally
d. Create a simple flow that uses the connector.
It seems you are missing the driver jar in the project classpath.
How to import the driver?
Once you have the jar file (you can download the jar for SQLite from a repository such as Maven Central), the next steps are very simple:
In the Package Explorer,
Right-click the project folder
Look in the menu for Build Path > Add External Archives…
Look for the jar file on your hard drive and click Open.
Now you should see in the Package Explorer that the jar file is present under "Referenced Libraries."
This will allow you to create an instance of the driver object you will need.
It's probably a classpath issue. If you are using Maven with your project, simply add the dependency in your pom.xml (and right click on your project > Mule > Update project dependencies):
<dependency>
    <groupId>org.xerial</groupId>
    <artifactId>sqlite-jdbc</artifactId>
    <version>3.20.1</version>
</dependency>
Make sure you understand how Maven works and how to manipulate your pom.xml file. Maven getting started and POM Introduction might help.
If you are not using Maven, you need to manually import the dependency into your classpath. @Malesh_Loya's answer should help.
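To confirm whether the driver is actually visible before wiring up the flow, a quick stdlib-only classpath check can help (org.sqlite.JDBC is the real sqlite-jdbc driver class; the helper class is mine):

```java
public class DriverCheck {
    // Returns true if the given class can be loaded from the current classpath.
    public static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        if (isOnClasspath("org.sqlite.JDBC")) {
            System.out.println("sqlite-jdbc found; jdbc:sqlite URLs should resolve");
        } else {
            System.out.println("sqlite-jdbc missing; add the jar to the project classpath");
        }
    }
}
```

If this prints the "missing" message from inside your project, "No suitable driver found for jdbc:sqlite" is exactly what you should expect at runtime.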

groovy upload jar to nexus

I have some jar file (custom) which I need to publish to Sonatype Nexus repository from Groovy script.
I have jar located in some path on machine where Groovy script works (for instance: c:\temp\module.jar).
My Nexus repo url is http://:/nexus/content/repositories/
On this repo I have folder structure like: folder1->folder2->folder3
During publishing my jar I need to create in folder3:
New directory with module's revision (my Groovy script knows this revision)
Upload jar to this directory
Create pom, md5 and sha1 files for jar uploaded
After several days of investigation I still have no idea how to create such a script, but this approach looks much cleaner than direct uploading.
I found http://groovy.codehaus.org/Using+Ant+Libraries+with+AntBuilder and some other stuff (stackoverflow non script solution).
I understand how to create ivy.xml in my Groovy script, but I don't understand how to create build.xml and ivysettings.xml on the fly and set up the whole system to work.
Could you please help me understand the Groovy way?
UPDATE:
I found that the following command works fine for me:
curl -v -F r=thirdparty -F hasPom=false -F e=jar -F g=<my_groupId> -F a=<my_artifactId> -F v=<my_artifactVersion> -F p=jar -F file=@module.jar -u admin:admin123 http://<my_nexusServer>:8081/nexus/service/local/repositories
As I understand it, curl performs a POST request to the Nexus service. Am I correct?
And now I'm trying to build HTTP POST request using Groovy HTTPBuilder.
How I should transform curl command parameters into Groovy's HTTPBuilder request?
Found a way to do this with the Groovy HttpBuilder, based on info from Sonatype and a few other sources.
This works with http-builder version 0.7.2 (not with earlier versions) and also needs an extra dependency: 'org.apache.httpcomponents:httpmime:4.2.1'.
The example also uses basic auth against Nexus.
import groovyx.net.http.HTTPBuilder
import groovyx.net.http.Method
import groovyx.net.http.ContentType
import groovyx.net.http.HttpResponseException
import org.apache.http.HttpRequest
import org.apache.http.HttpRequestInterceptor
import org.apache.http.entity.mime.MultipartEntity
import org.apache.http.entity.mime.content.FileBody
import org.apache.http.entity.mime.content.StringBody
import org.apache.http.protocol.HttpContext

class NexusUpload {
    def uploadArtifact(Map artifact, File fileToUpload, String user, String password) {
        def path = "/service/local/artifact/maven/content"
        HTTPBuilder http = new HTTPBuilder("http://my-nexus.org/")
        String basicAuthString = "Basic " + "$user:$password".bytes.encodeBase64().toString()

        http.client.addRequestInterceptor(new HttpRequestInterceptor() {
            void process(HttpRequest httpRequest, HttpContext httpContext) {
                httpRequest.addHeader('Authorization', basicAuthString)
            }
        })

        try {
            http.request(Method.POST, ContentType.ANY) { req ->
                uri.path = path
                MultipartEntity entity = new MultipartEntity()
                entity.addPart("hasPom", new StringBody("false"))
                entity.addPart("file", new FileBody(fileToUpload))
                entity.addPart("a", new StringBody("my-artifact-id"))
                entity.addPart("g", new StringBody("my-group-id"))
                entity.addPart("r", new StringBody("my-repository"))
                entity.addPart("v", new StringBody("my-version"))
                req.entity = entity
                response.success = { resp, reader ->
                    if (resp.status == 201) {
                        println "success!"
                    }
                }
            }
        } catch (HttpResponseException e) {
            e.printStackTrace()
        }
    }
}
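The basicAuthString built above is just standard HTTP Basic authentication; the same header value can be produced with plain JDK classes, which is handy for checking your credentials encoding outside Groovy. A small sketch:

```java
import java.util.Base64;

public class BasicAuth {
    // Build the value of an HTTP "Authorization: Basic ..." header.
    public static String header(String user, String password) {
        String token = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes());
        return "Basic " + token;
    }

    public static void main(String[] args) {
        // Same credentials as the curl example earlier in the thread.
        System.out.println(header("admin", "admin123"));
    }
}
```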
Ivy is an open source library, so one approach would be to call its classes directly. The problem with that approach is that there are few examples of how to invoke Ivy programmatically.
Since Groovy has excellent support for generating XML, I favour the slightly dumber approach of creating the files I understand as an Ivy user.
The following example is designed to publish files into Nexus generating both the ivy and ivysettings files:
import groovy.xml.NamespaceBuilder
import groovy.xml.MarkupBuilder

// Methods
// =======
def generateIvyFile(String fileName) {
    def file = new File(fileName)
    file.withWriter { writer ->
        def xml = new MarkupBuilder(writer)
        xml."ivy-module"(version: "2.0") {
            info(organisation: "org.dummy", module: "dummy")
            publications() {
                artifact(name: "dummy", type: "pom")
                artifact(name: "dummy", type: "jar")
            }
        }
    }
    return file
}

def generateSettingsFile(String fileName) {
    def file = new File(fileName)
    file.withWriter { writer ->
        def xml = new MarkupBuilder(writer)
        xml.ivysettings() {
            settings(defaultResolver: "central")
            credentials(host: "myrepo.com", realm: "Sonatype Nexus Repository Manager", username: "deployment", passwd: "deployment123")
            resolvers() {
                ibiblio(name: "central", m2compatible: true)
                ibiblio(name: "myrepo", root: "http://myrepo.com/nexus", m2compatible: true)
            }
        }
    }
    return file
}

// Main program
// ============
def ant = new AntBuilder()
def ivy = NamespaceBuilder.newInstance(ant, 'antlib:org.apache.ivy.ant')

generateSettingsFile("ivysettings.xml").deleteOnExit()
generateIvyFile("ivy.xml").deleteOnExit()

ivy.resolve()
ivy.publish(resolver: "myrepo", pubrevision: "1.0", publishivy: false) {
    artifacts(pattern: "build/poms/[artifact].[ext]")
    artifacts(pattern: "build/jars/[artifact].[ext]")
}
Notes:
More complex? Perhaps... However, if you're not generating the ivy file (i.e. you already use one to manage your dependencies), you can easily call the makepom task to generate the Maven POM files prior to the upload into Nexus.
The Nexus REST APIs work fine, but I find them a little cryptic, and of course a solution that uses them cannot support more than one repository manager (Nexus is not the only repository manager technology available).
The deleteOnExit() File method call ensures the working files are cleaned up properly.

How do I do the equivalent of "git repack -ad" with jgit?

I have implemented a DfsRepository using jgit-2.0.0.201206130900. It works great, but I want to repack it so that I have only one packfile. How do I do that via jgit?
Got this working. DfsGarbageCollector basically does the equivalent of repack -d. To get the repack -a behavior, use DfsPackCompactor:
void repack(DfsRepository repo) throws IOException {
    DfsGarbageCollector gc = new DfsGarbageCollector(repo);
    gc.pack(null);
    // update the list of packs for getPacks() below,
    // otherwise not all packs are compacted
    repo.scanForRepoChanges();
    // only compact if there are multiple pack files
    DfsPackFile[] packs = repo.getObjectDatabase().getPacks();
    if (packs.length > 1) {
        DfsPackCompactor compactor = new DfsPackCompactor(repo);
        for (DfsPackFile pack : packs) {
            compactor.add(pack);
        }
        compactor.compact(null);
    }
}
That's not quite all though.
DfsGarbageCollector creates a separate packfile for the garbage.
The easiest way I found to "delete" the garbage packfile was to return a DfsOutputStream from my DfsObjDatabase.writePackFile() implementation that simply threw away the data if the pack file's source was PackSource.UNREACHABLE_GARBAGE.
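The real implementation has to extend jgit's DfsOutputStream and be returned from DfsObjDatabase.writePackFile(), but the "throw away the data" part is just a stream that accepts writes and stores nothing. A hypothetical stdlib-only sketch of that idea:

```java
import java.io.OutputStream;

// A discarding stream: accepts writes, counts the bytes, stores nothing.
// In the jgit case, the analogous class would extend DfsOutputStream and
// be used only when the pack's source is PackSource.UNREACHABLE_GARBAGE.
public class DiscardingOutputStream extends OutputStream {
    private long count = 0;

    @Override
    public void write(int b) {
        count++;
    }

    @Override
    public void write(byte[] b, int off, int len) {
        count += len;
    }

    public long bytesDiscarded() {
        return count;
    }
}
```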

can't find IfxBulkCopy in IBM.Data.Informix 2.81.0.0

My question consists of two parts:
1- I want to use the IfxBulkCopy class to insert a large amount of data, but this class doesn't exist in the dll IBM.Data.Informix 2.81.0.0. How can I fix this problem?
Note: the class exists in IBM.Data.Informix 9.0.0.2, but I can't use that version because we use an old version of Informix.
When I use the new version I get the following exception:
Invalid argument
StackTrace = " at IBM.Data.Informix.IfxConnection.ReplaceConnectionStringParms(String szValue, IfxConnSettings& connSettings)\r\n at IBM.Data.Informix.IfxConnection.set_ConnectionString(String value)\r\n at Common.DBConnectionForInformix..ctor(String ConnectionStr...
My .cs:
public static void InsertAsBulk(DataTable dt)
{
    using (IfxConnection cn = new IfxConnection(ConfigurationManager.ConnectionStrings["aa"].ToString()))
    {
        cn.Open();
        using (IfxBulkCopy copy = new IfxBulkCopy(cn))
        {
            copy.ColumnMappings.Add(1, 2);
            copy.ColumnMappings.Add(2, 3);
            copy.ColumnMappings.Add(3, 4);
            copy.ColumnMappings.Add(4, 5);
            copy.ColumnMappings.Add(5, 6);
            copy.ColumnMappings.Add(6, 7);
            copy.DestinationTableName = "schday";
            copy.WriteToServer(dt);
        }
    }
}
2- Does IfxBulkCopy use transactions during the insert operation, or can it result in inconsistent data?
Your connection string is fine. Do you have multiple versions of the driver installed? I had the same problem when I installed OAT, the Informix driver, and the Informix Client SDK on the same machine.
This solved my problem:
1. Uninstall the driver and Client SDK + restart Windows
2. Install the Client SDK together with the data server driver
3. Check the system PATH variable. I added
C:\Program Files\IBM Informix Client SDK\bin
C:\Program Files\IBM Informix Client SDK\bin\netf20
to the system PATH. I'm not sure what the problem was; maybe just changing the PATH variable (without uninstalling) can fix it.
