Error connecting to a SQLite database in a Mule connector

I want to connect to and query a SQLite database from Mule in Anypoint Studio, but I get this error:
No suitable driver found for jdbc:sqlite
Here is my code:
@Processor(name = "select", friendlyName = "select")
public void select() {
    ArrayList<Story> list = new ArrayList<Story>();
    String sql = "select * from chat";
    try (Connection conn = this.connect();
         Statement stmt = conn.createStatement();
         ResultSet rs = stmt.executeQuery(sql)) {
        // loop through the result set
        while (rs.next()) {
            Story s = new Story();
            s.setStory(rs.getInt("id"), rs.getString("user_chat"), rs.getString("bot_chat"));
            list.add(s);
        }
    } catch (SQLException | ClassNotFoundException e) {
        System.out.println(e.getMessage());
    }
    for (int i = 0; i < list.size(); i++) {
        System.out.print(list.get(i).GetID() + "| " + list.get(i).GetUserChat() + "| " + list.get(i).GetBotChat() + "\n");
    }
}

private Connection connect() throws ClassNotFoundException {
    // SQLite connection string
    Class.forName("org.sqlite.JDBC");
    String url = "jdbc:sqlite:C:\\data.db";
    Connection conn = null;
    try {
        conn = DriverManager.getConnection(url);
    } catch (SQLException e) {
        System.out.println(e.getMessage());
    }
    return conn;
}
}

Make sure you have a valid driver jar on your project classpath.
Open a new Mule project in Studio, then follow these steps to add/create a datasource in the Mule flow:
a. Import the driver.
b. Create a datasource (see the sketch at the end of this answer),
c. Create a connector that uses our datasource, and finally
d. Create a simple flow that uses our connector.
It seems you are missing the driver jar on the project classpath.
How to Import the Driver?
Once you have the jar file (you can download the SQLite JDBC jar from a repository such as Maven Central), the next steps are very simple:
In the Package Explorer, right-click the project folder.
Look in the menu for Build Path > Add External Archives…
Find the jar file on your hard drive and click Open.
You should now see in the Package Explorer that the jar file is listed under “Referenced Libraries.”
This will allow you to create an instance of the driver object you will need.
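To confirm the jar is actually picked up, here is a minimal sketch that loads the driver and builds a datasource for step b above. It assumes the Xerial sqlite-jdbc driver (which provides org.sqlite.JDBC and org.sqlite.SQLiteDataSource) and reuses the C:\data.db path from the question:

import java.sql.Connection;
import java.sql.DriverManager;
import org.sqlite.SQLiteDataSource;

public class SqliteDriverCheck {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException if sqlite-jdbc is not on the classpath
        Class.forName("org.sqlite.JDBC");

        // Plain DriverManager connection, same URL style as in the question
        try (Connection conn = DriverManager.getConnection("jdbc:sqlite:C:\\data.db")) {
            System.out.println("Driver OK, connected: " + !conn.isClosed());
        }

        // A javax.sql.DataSource you can hand to the Mule datasource/connector (step b)
        SQLiteDataSource ds = new SQLiteDataSource();
        ds.setUrl("jdbc:sqlite:C:\\data.db");
        try (Connection conn = ds.getConnection()) {
            System.out.println("DataSource OK");
        }
    }
}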

It's probably a classpath issue. If you are using Maven with your project, simply add the dependency in your pom.xml (and right click on your project > Mule > Update project dependencies):
<dependency>
    <groupId>org.xerial</groupId>
    <artifactId>sqlite-jdbc</artifactId>
    <version>3.20.1</version>
    <!-- use the default (compile) scope, not test, so the driver is on the runtime classpath -->
</dependency>
Make sure you understand how Maven works and how to manipulate your pom.xml file. Maven getting started and POM Introduction might help.
If you are not using Maven, you need to manually add the dependency to your classpath. @Malesh_Loya's answer should help.


How to generate swagger.json [duplicate]

I am using the Java Spring Boot framework to create a REST API for my project, and I am using springfox-swagger2 and springfox-swagger-ui to generate the Swagger documentation. I am able to see my documentation using the URL http://localhost:8080/swagger-ui.html.
How can I create or generate swagger.json / spec.json? The documentation should not live with this application; we are using a separate application for listing the API docs.
You can get the URL from your swagger-ui HTML page:
GET http://localhost:8080/v2/api-docs?group=App
In fact, you can find all the URLs it calls with the Chrome/Firefox developer tools' network feature.
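If you want to save that JSON to a file without a browser, here is a small sketch; the URL (including the group parameter) and the swagger.json output name are illustrative and should be adjusted to your setup:

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class SwaggerJsonDownload {
    public static void main(String[] args) throws Exception {
        // The same endpoint the swagger-ui page calls
        URL apiDocs = new URL("http://localhost:8080/v2/api-docs?group=App");
        try (InputStream in = apiDocs.openStream()) {
            // Write the spec to swagger.json in the working directory
            Files.copy(in, Paths.get("swagger.json"), StandardCopyOption.REPLACE_EXISTING);
        }
        System.out.println("Wrote swagger.json");
    }
}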
If you use Maven, you can generate client- and server-side documentation (YAML, JSON, and HTML) using the swagger-maven-plugin.
Add this to your pom.xml:
.....
<plugin>
    <groupId>com.github.kongchen</groupId>
    <artifactId>swagger-maven-plugin</artifactId>
    <version>3.0.1</version>
    <configuration>
        <apiSources>
            <apiSource>
                <springmvc>true</springmvc>
                <locations>com.yourcontrollers.package.v1</locations>
                <schemes>http,https</schemes>
                <host>localhost:8080</host>
                <basePath>/api-doc</basePath>
                <info>
                    <title>Your API name</title>
                    <version>v1</version>
                    <description>description of your API</description>
                    <termsOfService>http://www.yourterms.com</termsOfService>
                    <contact>
                        <email>your-email@email.com</email>
                        <name>Your Name</name>
                        <url>http://www.contact-url.com</url>
                    </contact>
                    <license>
                        <url>http://www.licence-url.com</url>
                        <name>Commercial</name>
                    </license>
                </info>
                <!-- Supports a classpath or absolute file path here.
                     1) classpath e.g.: "classpath:/markdown.hbs", "classpath:/templates/hello.html"
                     2) file e.g.: "${basedir}/src/main/resources/markdown.hbs",
                        "${basedir}/src/main/resources/template/hello.html" -->
                <templatePath>${basedir}/templates/strapdown.html.hbs</templatePath>
                <outputPath>${basedir}/generated/document.html</outputPath>
                <swaggerDirectory>generated/swagger-ui</swaggerDirectory>
                <securityDefinitions>
                    <securityDefinition>
                        <name>basicAuth</name>
                        <type>basic</type>
                    </securityDefinition>
                </securityDefinitions>
            </apiSource>
        </apiSources>
    </configuration>
</plugin>
........
You can download a *.hbs template from this address:
https://github.com/kongchen/swagger-maven-example
Execute mvn swagger:generate
The JSON documentation will be generated in your project's generated/swagger-ui directory (the swaggerDirectory configured above).
Paste it into this address:
http://editor.swagger.io
and generate whatever you want (server-side or client-side API in your preferred technology).
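For reference, the <locations> element above should point at the package that contains your annotated controllers. A minimal sketch of such a controller follows; the package name and path are illustrative (they must match your <locations> value), and it assumes Spring MVC plus swagger 1.5-style annotations, which is what the plugin's springmvc mode scans:

package com.yourcontrollers.v1; // must match <locations> in the plugin configuration

import io.swagger.annotations.Api;
import io.swagger.annotations.ApiOperation;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@Api(value = "greetings")
@RestController
public class GreetingController {

    // Shows up in the generated swagger.json under /greeting
    @ApiOperation(value = "Returns a greeting")
    @RequestMapping(value = "/greeting", method = RequestMethod.GET)
    public String greeting() {
        return "hello";
    }
}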
I'm a little late here, but I just figured out that you can open your browser console and find the URL to the GET request that returns the JSON definition for your Swagger docs. The following technique worked for me when mapping my API to AWS API Gateway.
To do this:
Navigate to your Swagger docs endpoint
Open the browser console
Refresh the page
Navigate to the network tab and filter by XHR requests
Right-click the XHR request that ends in ?format=openapi and copy its response.
You can now just paste that into a new JSON file!
I have done this with a small trick: I added the following code at the end of my home controller test case.
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

import org.junit.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.web.client.TestRestTemplate;

public class HomeControllerTest extends .... ...... {

    @Autowired
    private TestRestTemplate restTemplate;

    @Test
    public void testHome() throws Exception {
        //.......
        //... my home controller test code
        //.....
        String swagger = this.restTemplate.getForObject("/v2/api-docs", String.class);
        this.writeFile("spec.json", swagger);
    }

    public void writeFile(String fileName, String content) {
        File theDir = new File("swagger");
        if (!theDir.exists()) {
            try {
                theDir.mkdir();
            } catch (SecurityException se) { }
        }
        BufferedWriter bw = null;
        FileWriter fw = null;
        try {
            fw = new FileWriter("swagger/" + fileName);
            bw = new BufferedWriter(fw);
            bw.write(content);
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (bw != null)
                    bw.close();
                if (fw != null)
                    fw.close();
            } catch (IOException ex) {
                ex.printStackTrace();
            }
        }
    }
}
I don't know whether this is the right way or not, but it works :)
Dependency
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger2</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>io.springfox</groupId>
    <artifactId>springfox-swagger-ui</artifactId>
    <version>2.6.1</version>
</dependency>
You should be able to get your swagger.json at
http://localhost:8080/api-docs
assuming you have not kept the versioning as in the pet store sample application. In that case the URL would be:
http://localhost:8080/v2/api-docs
To get the API JSON definition for a REST API, if Swagger is configured properly, you can use swagger/docs/v1 directly; the complete URL will then be (for version v1, or whichever version you specify):
http://localhost:8080/swagger/docs/v1
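Note that those springfox dependencies also need a configuration class to expose the /v2/api-docs endpoint. Here is a minimal sketch, assuming the springfox 2.x API; the class name is illustrative:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import springfox.documentation.builders.PathSelectors;
import springfox.documentation.builders.RequestHandlerSelectors;
import springfox.documentation.spi.DocumentationType;
import springfox.documentation.spring.web.plugins.Docket;
import springfox.documentation.swagger2.annotations.EnableSwagger2;

@Configuration
@EnableSwagger2
public class SwaggerConfig {

    // Exposes the spec at /v2/api-docs and backs the UI at /swagger-ui.html
    @Bean
    public Docket api() {
        return new Docket(DocumentationType.SWAGGER_2)
                .select()
                .apis(RequestHandlerSelectors.any())
                .paths(PathSelectors.any())
                .build();
    }
}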

How to launch local DynamoDB programmatically?

I am able to launch a local DynamoDB server from bash through this command:
java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb &
Is there not a pure-java way to start the server in one's code? I don't mean a java callout to the shell through the Process object but a way such that when I run my app, the server starts, and when my app is killed, the server is killed.
I can live with an embedded database if such a mode exists, though something that reflects server consistency semantics would be ideal.
EDIT: September 23rd 2015
On Aug 3, 2015 it was announced that an embedded DynamoDB Local can now run in the same process. You can add a Maven test dependency and use one of the ways below to run it.
<!--Dependency:-->
<dependencies>
    <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>DynamoDBLocal</artifactId>
        <version>[1.11,2.0)</version>
    </dependency>
</dependencies>
<!--Custom repository:-->
<repositories>
    <repository>
        <id>dynamodb-local-oregon</id>
        <name>DynamoDB Local Release Repository</name>
        <url>https://s3-us-west-2.amazonaws.com/dynamodb-local/release</url>
    </repository>
</repositories>
And here is an example taken from the awslabs/aws-dynamodb-examples Github repository:
AmazonDynamoDB dynamodb = null;
try {
    // Create an in-memory and in-process instance of DynamoDB Local that skips HTTP
    dynamodb = DynamoDBEmbedded.create().amazonDynamoDB();
    // use the DynamoDB API with DynamoDBEmbedded
    listTables(dynamodb.listTables(), "DynamoDB Embedded");
} finally {
    // Shutdown the thread pools in DynamoDB Local / Embedded
    if (dynamodb != null) {
        dynamodb.shutdown();
    }
}

// Create an in-memory and in-process instance of DynamoDB Local that runs over HTTP
final String[] localArgs = { "-inMemory" };
DynamoDBProxyServer server = null;
try {
    server = ServerRunner.createServerFromCommandLineArgs(localArgs);
    server.start();
    dynamodb = AmazonDynamoDBClientBuilder.standard().withEndpointConfiguration(
            // we can use any region here
            new AwsClientBuilder.EndpointConfiguration("http://localhost:8000", "us-west-2"))
            .build();
    // use the DynamoDB API over HTTP
    listTables(dynamodb.listTables(), "DynamoDB Local over HTTP");
} finally {
    // Stop the DynamoDB Local endpoint
    if (server != null) {
        server.stop();
    }
}
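If the goal is to start and stop the local instance together with a test run, here is a sketch that reuses the embedded calls above in JUnit 4 lifecycle methods. The class and test names are illustrative, and the package of DynamoDBEmbedded may differ between DynamoDBLocal versions:

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.local.embedded.DynamoDBEmbedded;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class EmbeddedDynamoDbTest {

    private AmazonDynamoDB dynamodb;

    @Before
    public void startEmbeddedDynamoDb() {
        // In-process, in-memory DynamoDB Local; it dies with the JVM
        dynamodb = DynamoDBEmbedded.create().amazonDynamoDB();
    }

    @After
    public void stopEmbeddedDynamoDb() {
        // Shut down the embedded instance's thread pools
        if (dynamodb != null) {
            dynamodb.shutdown();
        }
    }

    @Test
    public void listsTables() {
        // A fresh embedded instance starts with no tables
        System.out.println(dynamodb.listTables().getTableNames());
    }
}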
Old answer
Like you said, there is currently no built-in way in DynamoDBLocal or the SDK to do this. It would be nice if there were an embedded DynamoDBLocal that you could start up in the same process.
Here is a simple workaround/solution using java.lang.Process to start it up and shut it down programmatically in case others are interested.
Documentation for DynamoDBLocal can be found here, and here are the current definitions of the arguments:
-inMemory — Run in memory, no file dump
-port 4000 — Communicate using port 4000.
-sharedDb — Use a single database file, instead of separate files for each credential and region
Note that this is using the most recent version of DynamoDBLocal as of August 5th, 2015.
final ProcessBuilder processBuilder = new ProcessBuilder("java",
        "-Djava.library.path=./DynamoDBLocal_lib",
        "-jar",
        "DynamoDBLocal.jar",
        "-sharedDb",
        "-inMemory",
        "-port",
        "4000")
        .inheritIO()
        .directory(new File("/path/to/dynamo/db/local"));

final Process process = processBuilder.start();

Runtime.getRuntime().addShutdownHook(new Thread() {
    @Override
    public void run() {
        System.out.println("Shutdown DynamoDBLocal");
        process.destroy();
        try {
            process.waitFor(3, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            System.out.println("Process did not terminate after 3 seconds.");
        }
        System.out.println("DynamoDBLocal isAlive=" + process.isAlive());
    }
});

// Do some stuff
Write a Gradle task to extract the DynamoDB Local zip, and then you can use the https://github.com/marcoVermeulen/gradle-spawn-plugin Gradle plugin to launch DynamoDB Local. It is very easy to use, and there is no need for any process-builder magic.
Sample code -
// to start dynamodb-local
task launch(type: SpawnProcessTask) {
println("Launching....")
command "java -Djava.library.path=/location/to/dynamodb-local/DynamoDBLocal_lib -jar /location/to/dynamodb-local/DynamoDBLocal.jar -inMemory -delayTransientStatuses"
ready "Initializing DynamoDB Local"
}
// to stop dynamodb-local process
task stop(type: KillProcessTask)

Red5: Server application skeleton and helloworld

Can anyone provide an updated application skeleton for a Red5 application? From what I have found, the logging system changed from Log4j. I've been looking for tutorials just to set everything up, but can't really find something that simply works.
In addition, can anyone provide a simple tutorial with a server application and a Flex client?
Thanks in advance!
I struggled a lot with that. This reference worked for me:
http://fossies.org/unix/privat/red5-1.0.0-RC2.tar.gz:a/red5-1.0.0/doc/reference/html/logging-setup.html
The trick was to remove any log4j.properties or log4j.xml files and remove any "log4j" listeners from web.xml.
Create a logback-myApp.xml, where myApp is the name of your webapp, and place it on your webapp classpath (WEB-INF/classes, or in your application jar within WEB-INF/lib).
In my app I did:
import org.slf4j.Logger;
import org.red5.logging.Red5LoggerFactory;
and then:
private static Logger log = Red5LoggerFactory.getLogger(MyClassName.class, "myApp");
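As for the application skeleton itself, here is a minimal sketch of a server-side handler, written against the Red5 1.0 ApplicationAdapter API and using the logger shown above. The class name and the "myApp" context are illustrative, and package locations (for example IScope) can differ slightly between Red5 versions:

import org.red5.logging.Red5LoggerFactory;
import org.red5.server.adapter.ApplicationAdapter;
import org.red5.server.api.IConnection;
import org.red5.server.api.scope.IScope;
import org.slf4j.Logger;

public class Application extends ApplicationAdapter {

    private static Logger log = Red5LoggerFactory.getLogger(Application.class, "myApp");

    @Override
    public boolean appStart(IScope app) {
        log.info("myApp started");
        return super.appStart(app);
    }

    @Override
    public boolean appConnect(IConnection conn, Object[] params) {
        log.info("client connected");
        return super.appConnect(conn, params);
    }

    @Override
    public void appDisconnect(IConnection conn) {
        log.info("client disconnected");
        super.appDisconnect(conn);
    }
}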
The client's ActionScript looks like this:
// Initializing connection
private function initConnection():void {
    nc = new NetConnection();
    nc.client = new NetConnectionClient();
    nc.objectEncoding = flash.net.ObjectEncoding.AMF0;
    nc.connect(rtmpPath.text, true); // Path to the server, e.g. rtmp://<hostname>/<application name>
    nc.addEventListener("netStatus", publishStream); // Listener to see if connection is successful
}

private function publishStream(event:NetStatusEvent):void {
    if (nc.connected) {
        nsPublish = new NetStream(nc); // Initializing NetStream
        nsPublish.attachCamera(Camera.getCamera());
        nsPublish.attachAudio(Microphone.getMicrophone()); // Attaching camera & microphone
        nsPublish.publish(streamName.text, 'live'); // Publish stream
        mx.controls.Alert.show("Published");
    } else {
        mx.controls.Alert.show("Connection Error");
    }
}

Flyway output to SQL File

Is it possible to output the db migration to an SQL file instead of directly invoking database changes in flyway?
Most of the time this will not be needed, since with Flyway the DB migrations themselves are already written in SQL.
Yes, it's possible, and as far as I am concerned the feature is an absolute must for DBAs who don't want to allow Flyway in prod.
I made do with modifying code from here; it's a dry-run command for Flyway, and you can add a FileWriter and write out migrationDetails:
https://github.com/killbill/killbill/commit/996a3d5fd096525689dced825eac7a95a8a7817e
I did it like so... Project structure (just copied out of killbill's project, with the package renamed to flywaydr):
.
./main
./main/java
./main/java/com
./main/java/com/flywaydr
./main/java/com/flywaydr/CapturingMetaDataTable.java
./main/java/com/flywaydr/CapturingSqlMigrationExecutor.java
./main/java/com/flywaydr/DbMigrateWithDryRun.java
./main/java/com/flywaydr/MigrationInfoCallback.java
./main/java/com/flywaydr/Migrator.java
./main/java/org
./main/java/org/flywaydb
./main/java/org/flywaydb/core
./main/java/org/flywaydb/core/FlywayWithDryRun.java
In Migrator.java, add the following (implement the callback and wire it into DbMigrateWithDryRun.java):
} else if ("dryRunMigrate".equals(operation)) {
    MigrationInfoCallback mcb = new MigrationInfoCallback();
    flyway.dryRunMigrate();
    MigrationInfoImpl[] migrationDetails = mcb.getPendingMigrationDetails();
    if (migrationDetails.length > 0) {
        writeMasterScriptToFile(migrationDetails);
    }
}
Then, to write it out to a file, something like:
private static void writeMasterScriptToFile(MigrationInfoImpl[] migrationDetails) {
    FileWriter fw = null;
    try {
        String masterScriptLoc = "path/to/file";
        fw = new FileWriter(masterScriptLoc);
        LOG.info("Writing output to " + masterScriptLoc);
        for (final MigrationInfoImpl migration : migrationDetails) {
            Path file = Paths.get(migration.getResolvedMigration().getPhysicalLocation());
            // if you want to copy the actual script files parsed by flyway
            // (scriptspathloc is a target directory of your choosing)
            Files.copy(file, Paths.get(new StringBuilder(scriptspathloc).append(File.separator).append(file.getFileName().toString()).toString()), REPLACE_EXISTING);
        }
        // or just collect the sql itself
        for (final SqlStatement sqlStatement : sqlStatements) {
            // sqlStatement.getSql();
        }
        fw.write(stuff.toString()); // "stuff" being whatever SQL you collected above
    } catch (Exception e) {
        LOG.error("Could not write to file, io exception was thrown.", e);
    } finally {
        try { fw.close(); } catch (Exception e) { LOG.error("Could not close file writer.", e); }
    }
}
One last thing to mention: I compile and package this into a jar "with dependencies" (aka a fat jar) via Maven (look up the assembly plugin and its jar-with-dependencies descriptor) and run it with a command like the one below; you can also include it as a dependency and call it via the mvn exec:exec goal, which I had success with as well.
$ java -jar /path/to/flywaydr-fatjar.jar dryRunMigrate -regular.flyway.configs -etc -etc
I didn't find a way, so I switched to MyBatis Migrations. It looks quite nice.

EJB client in NetBeans Platform application module

I'm trying to create an EJB client in a NetBeans Platform application module, to call an EJB deployed on GlassFish.
I have added all the required jar files from my EJB server application and the required GlassFish jars:
appserv-rt.jar, javaee.jar, gf-client.jar.
The following code works fine when called from a standalone Java application, but when I try to call it from a NetBeans Platform application module, I'm unable to get the context.
Are there any NetBeans Platform specific configurations required?
try {
    Properties props = new Properties();
    props.setProperty("java.naming.factory.initial",
            "com.sun.enterprise.naming.SerialInitContextFactory");
    props.setProperty("java.naming.factory.url.pkgs", "com.sun.enterprise.naming");
    props.setProperty("java.naming.factory.state",
            "com.sun.corba.ee.impl.presentation.rmi.JNDIStateFactoryImpl");
    props.setProperty("org.omg.CORBA.ORBInitialHost", "127.0.0.1");
    props.setProperty("org.omg.CORBA.ORBInitialPort", "3700");

    InitialContext ctx = new InitialContext(props);
    MySessionBeanRemote mySessionBean =
            (MySessionBeanRemote) ctx.lookup("sessions.MySessionBeanRemote");

    Userprofile user = new Userprofile();
    user.setActive('A');
    user.setDescription("some desc");
    user.setEmail("abc");
    user.setFirstname("xyz");
    user.setLastname("123");
    user.setPassword("pwd");
    user.setStatus("Enabled");
    user.setUserid(Long.valueOf(25));
    user.setUsername("abc");
    mySessionBean.persist(user);
} catch (javax.naming.NamingException ne) {
    ne.printStackTrace();
} catch (Exception e) {
    e.printStackTrace();
}
This is not an easy task. Have a look at this tutorial; it's a bit old but should still apply.
