How to execute a JanusGraph jar file? - gremlin

I managed to build a jar file named "gremlin-example-1.0-SNAPSHOT.jar" by following the official "Connecting from Java" tutorial.
How can I execute this jar file?
Say I have an HBase jar file; I can execute it from the console with hbase com.HbaseExample or hadoop com.HbaseExample.
What is the magic line to execute a JanusGraph jar? I have attached the example code below:
import org.apache.tinkerpop.gremlin.process.traversal.dsl.graph.GraphTraversalSource;
import org.apache.tinkerpop.gremlin.structure.Graph;
import org.apache.tinkerpop.gremlin.structure.util.empty.EmptyGraph;

public class executeGremlin {
    public static void main(String[] args) throws Exception {
        // build a traversal source that sends traversals to a remote Gremlin Server
        Graph graph = EmptyGraph.instance();
        GraphTraversalSource g = graph.traversal().withRemote("conf/remote-graph.properties");
        Object herculesAge = g.V().has("name", "hercules").values("age").next();
        System.out.println("Hercules is " + herculesAge + " years old.");
    }
}
And the jar file is named gremlin-example-1.0-SNAPSHOT.jar

The code you posted is expecting to connect to a Gremlin Server, and that server would have to be configured to use JanusGraph. Note that JanusGraph needs a process to launch it (such as your application or a Gremlin Server) and also some backend storage, unless you configure it to run as "inmemory". I have a section on setting up JanusGraph in Practical Gremlin. I wrote that section a while back, but I think it should still be mostly correct. Definitely check the official docs as well.
http://www.kelvinlawrence.net/book/PracticalGremlin.html#janusintro
There is some sample Java code here that goes along with the examples in the book.
https://github.com/krlawrence/graph/blob/master/sample-code/JanusCassandra.java
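As for the actual command: a plain client jar like this is normally launched with java itself rather than a wrapper script like hbase or hadoop. A minimal sketch, assuming your build copied the JanusGraph/TinkerPop dependency jars into a lib directory next to your jar (that directory layout is an assumption about your build, not something the tutorial guarantees):
java -cp gremlin-example-1.0-SNAPSHOT.jar:lib/* executeGremlin
On Windows, use ; instead of : as the classpath separator. Alternatively, the Maven shade or assembly plugin can produce a single self-contained jar that runs with java -jar.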

FileReader can't find R Script

I am trying to run my R script within JavaFX. I use Renjin for this purpose, and it seems to work properly with statements I run internally, but I want to run an external R script. The project is set up with Maven, so the path should be easy, as the R script is in the resources folder. The path works when I load FXML files, so I'm pretty confused why it can't find my script.
Here's a short example:
package survey;

import javax.script.*;
import org.renjin.script.*;
import java.io.FileReader;

public class calcFunction {
    public static void main(String[] args) throws Exception {
        // create a script engine manager:
        RenjinScriptEngineFactory factory = new RenjinScriptEngineFactory();
        // create a Renjin engine:
        ScriptEngine engine = factory.getScriptEngine();
        engine.put("x", 4);
        engine.put("y", 5);
        engine.eval(new FileReader("/test.R"));
    }
}
Is something missing? Thanks in advance!
EDIT1:
With my FXML files it works with the "/" path like this:
root = FXMLLoader.load(getClass().getResource("/moduleDa.fxml"));
EDIT2:
Someone who deleted his comment proposed this:
engine.eval(new FileReader(new File(".").getAbsolutePath() + "/test.R"));
It works if the script is in the root directory of the project, where the pom.xml file is located. @James_D made it work so the R script can be located in the resources folder - thanks a lot!
If your R script is bundled as part of the application, it can't be treated as a file - you need to treat it as a resource. Typically, you will deploy your application as a Jar file, and the resources will be elements within that jar file (they won't be files in their own right).
So just treat the R script as a resource and load it as such. I don't know the renjin framework, but I assume ScriptEngine here is a javax.script.ScriptEngine, in which case ScriptEngine.eval(...) takes a Reader as a parameter, and so (if your R script is located in the root of the class path) you can do
engine.eval(new InputStreamReader(getClass().getResourceAsStream("/test.R")));
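To avoid leaking the stream, you can also wrap the reader in try-with-resources so it is closed after evaluation. A small sketch of the same idea (the explicit UTF-8 charset is my assumption; use whatever encoding your script actually has):
import java.io.Reader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

// load the R script from the classpath and close the reader when done
try (Reader reader = new InputStreamReader(
        getClass().getResourceAsStream("/test.R"), StandardCharsets.UTF_8)) {
    engine.eval(reader);
}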

How to enable Visor Command Line in Apache Ignite?

I have started an Apache Ignite server via a Maven dependency through Eclipse. Can anyone tell me how to monitor the cache through the Visor command line? How do I enable it when Apache Ignite is set up via Maven?
I think the easiest way is to download the binary distribution and launch the Visor command line from its "\bin" folder. Note that you need to download the release that matches the one you are using in your Maven-based application.
The second way is to use the ignite-visor-console module from Maven.
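If you go this route, the dependency declaration would look something like this (a sketch; the version must match the Ignite release you are already using):
<dependency>
    <groupId>org.apache.ignite</groupId>
    <artifactId>ignite-visor-console</artifactId>
    <version>${ignite.version}</version>
</dependency>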
Then start the Visor command line via the org.apache.ignite.visor.commands.VisorConsole object (it extends App). Note that the Visor command line is written in Scala.
Sample code:
import org.apache.ignite.visor.commands.VisorConsole;

public class Test {
    public static void main(String[] args) {
        // delegate to Visor's own entry point, which starts the interactive console
        VisorConsole.main(args);
    }
}
Also see Visor command line documentation.
Also give the Web Console a try, as Dmitriy suggested.

How to run standalone TestNG project from jar/bat/

I have a TestNG project. It doesn't have a main class; currently it runs via "Run As TestNG".
I want to export it as a runnable jar (or a plain jar) so that anyone can just run a command from the command line and the test cases start running.
Could anyone help me out with this, or suggest any other way to deliver the code in runnable form?
I am not using Ant or Maven.
Thanks
I seem to have found the solution after a bit of googling. This works fine in Eclipse (Juno).
Say you have a TestNG file named 'Tests.java'. As you rightly pointed out, there won't be a class with a main method.
So we have to create a new Java class file under the same package. Let us name it 'MainOne.java'. This will have a class with a main method.
Here is the code you need:
import org.testng.TestNG; // in current TestNG versions the class lives in org.testng, not com.beust.testng

public class MainOne {
    public static void main(String[] args) {
        TestNG testng = new TestNG();
        Class[] classes = new Class[]{Tests.class};
        testng.setTestClasses(classes);
        testng.run();
    }
}
Run 'MainOne.java' as a Java application. Then right-click on the package -> Export -> Runnable Jar [choose 'MainOne' as the launch configuration] -> Finish.
My current understanding is that, in order to benefit from the parallel niftiness of TestNG, one should use the static main method in org.testng's jar file when running the Java class from the command line rather than from inside the Eclipse IDE.
The issue then becomes the classpath, which defines how Java finds all the JAR files. I found http://javarevisited.blogspot.com/2012/10/5-ways-to-add-multiple-jar-to-classpath-java.html to be most useful because it mentions the * wildcard: VERY helpful when you need to reference all the jar files required for Selenium + TestNG + custom test suites.
This is my current Windows BAT file, and it works. ADV.jar contains my custom class but no main method.
setlocal
set r=d:\Apps\Selenium\
cd /d %~dp0
java -classpath %r%Downloaded\*;%r%MyCompany\ADV.jar; org.testng.TestNG .\testng-customsuite-adv.xml
pause
All the JAR files that I downloaded from public places went into my d:\Apps\Selenium\Downloaded folder. I put my custom ADV.jar file in d:\Apps\Selenium\MyCompany to keep it separate.
I created my ADV.jar file from Eclipse using Export Jar file and ignored warnings about a missing main method.
Aside: while this https://stackoverflow.com/a/16879386/424855 was very intriguing, I could not figure out how to make that work.
Here is a better way to do it.
You can just create a main method that lists all the test classes to be executed, as follows:
// requires org.testng.TestNG and org.testng.TestListenerAdapter from the TestNG jar
public static void main(String[] args) {
    TestListenerAdapter tla = new TestListenerAdapter();
    TestNG testng = new TestNG();
    testng.setTestClasses(new Class[] { test_start.class });
    testng.addListener(tla);
    testng.run();
}
Here is the reference URL from the official testng website.

Output directory not set in JobConf

I am including below the driver code of a simple MapReduce program:
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
@SuppressWarnings("deprecation")
public class CsvParserDriver {
    @SuppressWarnings("deprecation")
    public static void main(String[] args) throws Exception
    {
        if (args.length != 2)
        {
            System.out.println("usage: [input] [output]");
            System.exit(-1);
        }
        JobConf conf = new JobConf(CsvParserDriver.class);
        Job job = new Job(conf);
        conf.setJobName("CsvParserDriver");
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        conf.setMapperClass(CsvParserMapper.class);
        conf.setMapOutputKeyClass(IntWritable.class);
        conf.setMapOutputValueClass(Text.class);
        conf.setReducerClass(CsvParserReducer.class);
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(Text.class);
        conf.set("splitNode", "NUM_AE");
        JobClient.runJob(conf);
    }
}
I am running my code using the below command
hadoop jar CsvParser.jar CsvParserDriver /user/sritamd/TestData /user/sritamd/output
(All the respective jars and directories in the above command have been created.)
I get the following error:
Exception in thread "main" org.apache.hadoop.mapred.InvalidJobConfException: Output directory not set in JobConf.
You didn't create the HDFS input and output directories as specified in the apache-hadoop-tutorial.
If you want to use a local directory, use file:///user/sritamd/TestData, i.e. add the FS prefix.
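For example, the input directory can be created and populated up front with the HDFS shell (the paths are the ones from your command; data.csv is a hypothetical local input file):
hadoop fs -mkdir /user/sritamd/TestData
hadoop fs -put data.csv /user/sritamd/TestData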
This might be caused by mixing the old API and the new API.
Here is how I do the configuration with the new Job API.
Step 1: import the new API lib:
import org.apache.hadoop.mapreduce.Job
Step 2: do the configuration via the new API Job (this answer's snippet is in Scala):
val job = Job.getInstance(conf)
job.getConfiguration.set(TableOutputFormat.OUTPUT_TABLE, tableName)
job.setOutputFormatClass(classOf[TableOutputFormat[Put]])
Hope this can help you.
Try this
Configuration configuration = new Configuration();
Job job = new Job(configuration, "MyConfig");
then
FileInputFormat.setInputPaths(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));
Your HDFS filesystem might not be created yet. You need to format the namenode first; after that, the filesystem can hold the input and output directories for Hadoop:
/usr/local/hadoop/bin/hadoop namenode -format
Use this link: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
and follow each step.
I think you need to set the input and output directories on conf instead of job, like:
FileInputFormat.setInputPaths(conf, new Path(args[0]));
FileOutputFormat.setOutputPath(conf, new Path(args[1]));
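Building on that, here is a sketch of the driver written consistently against the old org.apache.hadoop.mapred API, so the paths end up on the same JobConf that JobClient.runJob() receives (the mapper and reducer classes are the asker's; the key/value types are carried over from the question):
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class CsvParserDriver {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(CsvParserDriver.class);
        conf.setJobName("CsvParserDriver");

        // the old-API helpers take the JobConf itself, so the paths
        // are set on the object that JobClient.runJob() actually sees
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        conf.setMapperClass(CsvParserMapper.class);
        conf.setMapOutputKeyClass(IntWritable.class);
        conf.setMapOutputValueClass(Text.class);
        conf.setReducerClass(CsvParserReducer.class);
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(Text.class);

        JobClient.runJob(conf);
    }
}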
If you are running Hadoop in standalone mode (without a cluster) to test the code, you don't need the fs prefix on the output path. You can initialize the Job and set the paths. The following code should work (make sure you are consistently using either Job (from org.apache.hadoop.mapreduce.Job) or JobConf (from org.apache.hadoop.mapred.JobConf)):
Job job = new Job();
job.setJobName("Job Name");
job.setJarByClass(MapReduceJob.class);
FileInputFormat.setInputPaths(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));
job.setMapperClass(MaxTemperatureMapper.class);
job.setReducerClass(MaxTemperatureReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
System.exit(job.waitForCompletion(true) ? 0 : 1);
I had the same issue but fixed it. I was using job.waitForCompletion(true), which caused Spark on HBase to crash when using saveAsNewAPIHadoopFile(...).
You should not wait for your job, since it is using the old Hadoop API instead of the new API.
First, make sure your output directory does not already exist. If it exists, delete it.
Second, run your code in Eclipse; if it runs properly there and only gives an ArrayOutOfBounds warning,
then check the libraries you added: make sure you inserted all the CLIENT libraries, and check that your class is in a package.
If all the above conditions are met, your job will execute.

Where is class weblogic.jndi.WLInitialContextFactory?

when trying to execute my jar file I get an exception:
javax.naming.NoInitialContextException: Cannot instantiate class: weblogic.jndi.WLInitialContextFactory
[Root exception is java.lang.ClassNotFoundException: weblogic.jndi.WLInitialContextFactory]
I guess this is some kind of missing library on the classpath.
Can anyone tell me which jar-file is missing? I can't find the class weblogic.jndi.WLInitialContextFactory anywhere...
Thanks!
P.S.: I already have weblogic 10.0 jar included.
Check your server/lib/ folder to find wlclient.jar.
With WebLogic 12.1.3, you can find it here:
${INSTALL_DIR}/inventory/wlserver/server/lib/wlclient.jar
Step 1:
Go to E:\weblogic81\user_projects\domains\mydomain and run the setenv command, as follows:
E:\weblogic81\user_projects\domains\mydomain>setenv
Step 2:
The weblogic.jar file is needed by your client application. It can be found at E:\weblogic81\weblogic81\server\lib\weblogic.jar, so either set the classpath to this folder or copy weblogic.jar into your application folder so that it is available to your application first:
E:\weblogic81\user_projects\domains\mydomain>set CLASSPATH=%CLASSPATH%;E:\weblogic81\weblogic81\server\lib;.
Step 3:
Go to the domain folder in the command prompt as shown above and set the classpath.
So as not to disturb other classpaths, set it as:
set CLASSPATH=%CLASSPATH%;E:\weblogic81\weblogic81\server\lib;.
Here the dot (.) adds the current directory to the classpath.
Step 4:
After the classpath is set, run the STARTWEBLOGIC command as follows:
E:\weblogic81\user_projects\domains\mydomain>STARTWEBLOGIC
Step 5:
Do not log in to the WebLogic server. If you are already logged in, log out, then write the following code in MyEclipse or some other IDE.
Step 6:
package directory.service;

import javax.naming.*;

public class GetInitContext {
    /**
     * @param args
     */
    public static void main(String[] args) {
        try {
            weblogic.jndi.Environment env = new weblogic.jndi.Environment();
            env.setInitialContextFactory(
                    weblogic.jndi.Environment.DEFAULT_INITIAL_CONTEXT_FACTORY);
            env.setProviderUrl("t3://localhost:7001");
            env.setSecurityPrincipal("agni");
            env.setSecurityCredentials("agnidevam");
            Context context = env.getInitialContext();
            System.out.println("got the initial context for weblogic server---> " + context);
            context.createSubcontext("sone");
            context.bind("agni one", new Integer(10));
            context.createSubcontext("sone/sctwo");
            context.bind("agni two", new Integer(20));
            context.createSubcontext("sone/sctwo/scthree");
            context.bind("agni three", new Integer(30));
            System.out.println("subcontext object created, please check in admin server for more details");
        } catch (Exception e) {
            System.out.println("exception ---> " + e);
        }
    }
}
Step 7:
Execute the above code, then log in to WebLogic and right-click on myserver > view JNDI tree; you will find the bound objects' information.
It looks like you are doing a JNDI lookup outside of WLS.
You need to use wlfullclient.jar, or if your machine has a WLS installation, then add WL_HOME/server/lib/weblogic.jar to your project classpath.
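For context, the standard standalone lookup that triggers loading of this class looks something like the following sketch (the host, port, and credentials are placeholders, not values from the question):
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.InitialContext;

Hashtable<String, String> env = new Hashtable<>();
env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
env.put(Context.PROVIDER_URL, "t3://localhost:7001");
// fails with the NoInitialContextException from the question
// if WLInitialContextFactory is not on the classpath
Context ctx = new InitialContext(env);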
I faced the same issue and it's fixed now :)
The fix is to go to the WebLogic server, navigate to /Oracle/Middleware/wlserver_10.3/server/lib/, and execute the command below.
Command: java -jar wljarbuilder.jar -profile wlfullclient5
The above command creates a jar file (wlfullclient5.jar) containing all the jars inside the WebLogic server lib folder. Put it on your client code's build path in Eclipse, create your runnable JAR file, and place the wlfullclient5.jar file in the server/lib folder as well.
Hope this helps! Kindly let me know if you have any issues.
Adding wlserver/server/lib/weblogic.jar is enough. I have tested it.
Check the following tag in your build.xml:
<property name="WLS_HOME" value="${env.WLS_HOME}"/>
where WLS_HOME=c:\weblogic\wls\wlserver if running on Windows.
I kept trying to run a simple hello world program and it kept throwing:
run:
[echo] Executing client class
[java] javax.naming.NoInitialContextException: Cannot instantiate class: weblogic.jndi.WLInitialContextFactory [Root exception is java.lang.ClassNotFoundException: weblogic.jndi.WLInitialContextFactory]
Once I changed the above-mentioned tag in the build.xml, it worked fine.
It is packaged inside of the weblogic.jar under your server/lib.
in version 12c it is located in weblogic-classes.jar in your lib directory:
C:\wls1213\wlserver\server\lib
For WLS 12.2, where WL_HOME is the BEA home directory of your WebLogic installation
(by default WL_HOME is Middleware\Oracle_Home\wlserver):
%WL_HOME%\server\lib\wlclient.jar
%WL_HOME%\server\lib\wls-api.jar
%WL_HOME%\server\lib\wls-api-part.jar
%WL_HOME%\server\lib\wlthint3client.jar
All of these libs contain the class weblogic\jndi\WLInitialContextFactory.class.
See the WLS doc: https://docs.oracle.com/en/middleware/fusion-middleware/weblogic-server/12.2.1.4/wlprg/overview.html#GUID-FC14CC53-DE49-456F-B54C-D73CC6DBF818
I faced the issue stated here and managed to solve it by fixing the WL_HOME environment variable.
In my case the wlserver_10.3 folder was moved to another drive (from D to E), and the person who did the disk "migration" forgot to change the WL_HOME value at PATH\TO\Oracle\Middleware\wlserver_10.3\common\bin.
By fixing the wlserver_10.3 path I was able to deploy JARs to WebLogic.
