Infinispan cluster with Karaf instances

We are very new to Infinispan and also quite new to Apache Karaf. Installing Infinispan in Karaf was easy; we then wrote two OSGi bundles to form a cluster with two nodes that run on one host, following the distributed-cache tutorial from the Infinispan website. Unfortunately the cluster never seems to form, and we can't determine why. Any help or push in the right direction would be much appreciated.
The code of the bundle that writes something into the cache looks like this:
import org.infinispan.Cache;
import org.infinispan.configuration.cache.CacheMode;
import org.infinispan.configuration.cache.ConfigurationBuilder;
import org.infinispan.configuration.global.GlobalConfigurationBuilder;
import org.infinispan.context.Flag;
import org.infinispan.manager.DefaultCacheManager;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class CacheProducer implements BundleActivator {
    private static final Logger LOG = LoggerFactory.getLogger(CacheProducer.class);
    private static DefaultCacheManager cacheManager;

    @Override
    public void start(BundleContext context) throws Exception {
        LOG.info("Start Producer");
        GlobalConfigurationBuilder global = GlobalConfigurationBuilder.defaultClusteredBuilder();
        global.transport().clusterName("ClusterTest");
        // Make the default cache a distributed synchronous one
        ConfigurationBuilder builder = new ConfigurationBuilder();
        builder.clustering().cacheMode(CacheMode.DIST_SYNC);
        // Initialize the cache manager
        cacheManager = new DefaultCacheManager(global.build(), builder.build());
        // Obtain the default cache
        Cache<String, String> cache = cacheManager.getCache();
        cache.put("message", "Hello World!");
        LOG.info("Producer: whole cluster content!");
        cache.entrySet().forEach(entry -> LOG.info(entry.getKey() + ": " + entry.getValue()));
        LOG.info("Producer: current cache content!");
        cache.getAdvancedCache().withFlags(Flag.SKIP_REMOTE_LOOKUP)
                .entrySet().forEach(entry -> LOG.info(entry.getKey() + ": " + entry.getValue()));
    }

    @Override
    public void stop(BundleContext context) throws Exception {
        cacheManager.stop();
    }
}
And the bundle that tries to print out what is in the cache looks like this:
package metdoc81.listener;

import org.infinispan.Cache;
import org.infinispan.configuration.cache.CacheMode;
import org.infinispan.configuration.cache.ConfigurationBuilder;
import org.infinispan.configuration.global.GlobalConfigurationBuilder;
import org.infinispan.manager.DefaultCacheManager;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Activator implements BundleActivator {
    private static final Logger LOG = LoggerFactory.getLogger(Activator.class);
    private static DefaultCacheManager cacheManager;

    @Override
    public void start(BundleContext bundleContext) throws Exception {
        LOG.info("start cluster listener");
        GlobalConfigurationBuilder global = GlobalConfigurationBuilder.defaultClusteredBuilder();
        global.transport().clusterName("ClusterTest");
        // Make the default cache a distributed synchronous one
        ConfigurationBuilder builder = new ConfigurationBuilder();
        builder.clustering().cacheMode(CacheMode.DIST_SYNC);
        // Initialize the cache manager
        cacheManager = new DefaultCacheManager(global.build(), builder.build());
        // Obtain the default cache
        Cache<String, String> cache = cacheManager.getCache();
        LOG.info("After configuration");
        cache.entrySet().forEach(entry -> LOG.info(entry.getKey() + ": " + entry.getValue()));
        LOG.info("After logging");
    }

    @Override
    public void stop(BundleContext bundleContext) throws Exception {
    }
}
The printing from the CacheProducer works; the printing from the listener does not.

We found the solution ourselves.
The problem only occurs when you run the code on macOS; on Windows it works. According to a discussion at JBossDeveloper, there was a problem with multicast routing on macOS. Even though a workaround was added to the example code, you still have to pass the -Djava.net.preferIPv4Stack=true flag when running it, or add these two lines of code:
Properties properties = System.getProperties();
properties.setProperty("java.net.preferIPv4Stack", "true");
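For reference, a minimal sketch of where those two lines could go in the activators above (our assumption: the property must be set before the first DefaultCacheManager is created, since JGroups reads it when the transport starts):

@Override
public void start(BundleContext context) throws Exception {
    // Assumption: must run before Infinispan/JGroups initializes the transport
    System.setProperty("java.net.preferIPv4Stack", "true");
    GlobalConfigurationBuilder global = GlobalConfigurationBuilder.defaultClusteredBuilder();
    global.transport().clusterName("ClusterTest");
    // ... continue building the cache manager exactly as in the activators above
}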

Related

How to Access the JavaFX Virtual Keyboard (FXVK) Using OpenJDK 15 or Beyond?

I use the JavaFX virtual keyboard with OpenJDK 8. At times I have to access the virtual keyboard to prevent it from displaying when certain text fields get focus. An example of this is a screen where an operator has to scan in multiple barcodes; there the virtual keyboard gets in the way. With OpenJDK 8 we were able to disable the virtual keyboard like this:
FXVK.detach(); //after importing "com.sun.javafx.scene.control.skin.FXVK"
We are now upgrading to OpenJDK 15 and building our UI with Gradle. "com.sun.javafx.scene.control.skin.FXVK" is no longer accessible in a modular Gradle project. I don't believe using a different virtual keyboard is an option, so can anyone explain how to access this FXVK class after Java 8?
Is there a way to use --add-exports or --patch-module with a JAR to patch JavaFX to gain access to the internal class?
Below is the code for a sample project that shows this problem.
This is the JavaFX Application class that simply displays a text field, and it shows the code I could use with Java 8 to not show the virtual keyboard.
package com.test.sampleapp.application;

// Not accessible in Java 15:
//import com.sun.javafx.scene.control.skin.FXVK;
import javafx.application.Application;
import javafx.beans.value.ChangeListener;
import javafx.beans.value.ObservableValue;
import javafx.scene.Scene;
import javafx.scene.control.Label;
import javafx.scene.control.TextField;
import javafx.scene.layout.VBox;
import javafx.stage.Stage;

public class Main extends Application {

    public static void main(String[] args) {
        launch(args);
    }

    @Override
    public void start(Stage primaryStage) throws Exception {
        Label label = new Label("Text field below");
        TextField textField = new TextField();
        VBox vbox = new VBox(label);
        vbox.getChildren().add(textField);
        Scene scene = new Scene(vbox);
        primaryStage.setScene(scene);
        primaryStage.show();
        textField.focusedProperty().addListener(new ChangeListener<Boolean>() {
            @Override
            public void changed(ObservableValue<? extends Boolean> observable, Boolean oldValue,
                    Boolean newValue) {
                // If focused
                if (newValue) {
                    // Need this to disable the virtual keyboard when using a textfield with scanning
                    //FXVK.detach();
                }
            }
        });
    }
}
Then I needed to add a wrapper class to have the virtual keyboard show up. Please note that most of the time I do use the virtual keyboard when text fields get focus; it's only in certain situations that I need to be able to disable it programmatically.
The wrapper class:
package com.test.sampleapp.application;

import java.lang.reflect.Method;

public class AppWrapper {

    public static void main(String[] args) throws Exception {
        Class<?> app = Class.forName("com.test.sampleapp.application.Main");
        Method main = app.getDeclaredMethod("main", String[].class);
        System.setProperty("com.sun.javafx.isEmbedded", "true");
        System.setProperty("com.sun.javafx.touch", "true");
        System.setProperty("com.sun.javafx.virtualKeyboard", "javafx");
        Object[] arguments = new Object[]{args};
        main.invoke(null, arguments);
    }
}
Let me know if you need anything else, such as the build.gradle file; however, this is mostly just an issue with Java 9 or beyond.
The FXVK class still exists in the same package, so the only issue is that its package is not exported by the javafx.controls module. If you must use this internal class, then you can pass an appropriate --add-exports JVM argument both at compile-time and at run-time.
Here's a simple application that calls FXVK#detach():
// Will fail at compile-time if the '--add-exports' argument is not
// passed to 'javac'
import com.sun.javafx.scene.control.skin.FXVK;
import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.control.Label;
import javafx.scene.layout.StackPane;
import javafx.stage.Stage;

public class Main extends Application {

    @Override
    public void start(Stage primaryStage) {
        var root = new StackPane(new Label("Hello, World!"));
        primaryStage.setScene(new Scene(root, 600, 400));
        primaryStage.show();
        // Will fail at run-time if the '--add-exports' argument is
        // not passed to 'java'
        FXVK.detach();
    }
}
Assuming you put the Main.java file in your working directory, you can compile it with:
javac -p <path-to-fx> --add-modules javafx.controls --add-exports javafx.controls/com.sun.javafx.scene.control.skin=ALL-UNNAMED Main.java
And run it with:
java -p <path-to-fx> --add-modules javafx.controls --add-exports javafx.controls/com.sun.javafx.scene.control.skin=ALL-UNNAMED Main
If your code is modular, you can get rid of --add-modules, but you must change ALL-UNNAMED to the name of your module. Also, make sure to launch your application via --module (or -m). Note that -p above is shorthand for --module-path.
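For example, if your code lived in a hypothetical module named com.example.app with main class com.example.app.Main (both names are placeholders), the run command would become:
java -p <path-to-fx>:<path-to-your-module> --add-exports javafx.controls/com.sun.javafx.scene.control.skin=com.example.app -m com.example.app/com.example.app.Main
(On Windows, separate module-path entries with ; instead of :.)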
If you use a build tool (e.g., Maven, Gradle, etc.), then you'll have to look up how to set these JVM arguments for that tool. You'll also have to take into account how you deploy your application. For instance, if you use jpackage, you can use its --java-options argument to set the --add-exports option for when your application is launched.
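A sketch of such a jpackage invocation (all names and paths here are placeholders; only the --java-options part matters):
jpackage --name MyApp --input target/lib --main-jar my-app.jar --java-options "--add-exports javafx.controls/com.sun.javafx.scene.control.skin=ALL-UNNAMED"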
You may also need to tell your IDE that you are giving yourself access to the internal package. Otherwise, your IDE will likely yell at you for trying to use an inaccessible type.

How to create a link from one folder to another folder in an Alfresco repository using the Java API

Hi guys, I am a beginner in Alfresco. I have implemented many services, such as creating folders and subfolders, uploading and downloading documents, and setting permissions, using CMIS.
But I am not able to create a link from one folder to another folder using CMIS.
Somebody told me it's not possible using CMIS.
Somehow I found this link: http://basanagowdapatil.blogspot.in/2011/01/code-for-creating-links-in-alfresco.html.
But this code is not in CMIS.
I have never done this kind of programming.
Can somebody suggest how to do this in a Maven project?
What dependencies or JARs should I add?
It would be even better if someone explained it step by step (in the sense of how to do authentication).
Thanks in advance
I got my answer: the same can be done using the CMIS API.
import java.util.HashMap;
import java.util.Map;
import org.apache.chemistry.opencmis.client.api.Folder;
import org.apache.chemistry.opencmis.client.api.Session;
import org.apache.chemistry.opencmis.commons.PropertyIds;
import org.apache.chemistry.opencmis.commons.enums.BaseTypeId;
import org.apache.log4j.BasicConfigurator;
import org.apache.log4j.Logger;
import com.bizruntime.alfresco.session.CreateSession;
import com.bizruntime.alfresco.util.Config;

public class CreateLink {
    static Logger log = Logger.getLogger(CreateLink.class);

    public static void getLink() {
        // Create the session
        Session cmisSession = new CreateSession().getSession();
        log.debug("Session created...");
        Map<String, Object> properties = new HashMap<>();
        properties.put(PropertyIds.BASE_TYPE_ID, BaseTypeId.CMIS_ITEM.value());
        // Define a name and description for the link
        properties.put(PropertyIds.NAME, Config.getConfig().getProperty("nameOfLink"));
        properties.put("cmis:description", Config.getConfig().getProperty("linkDescription"));
        properties.put(PropertyIds.OBJECT_TYPE_ID, "I:app:filelink");
        // Define the destination node reference
        properties.put("cm:destination", Config.getConfig().getProperty("destination-nodRef"));
        // Choose the folder where the link is to be created
        Folder rootFolder = cmisSession.getRootFolder();
        Folder targetFolder = (Folder) cmisSession.getObjectByPath(rootFolder.getPath() + Config.getConfig().getProperty("targetFolder"));
        cmisSession.createItem(properties, targetFolder);
        log.info("Link created successfully....");
    }

    public static void main(String[] args) {
        BasicConfigurator.configure();
        CreateLink.getLink();
    }
}
Code for creating a folder link:
import java.util.HashMap;
import java.util.Map;
import org.apache.chemistry.opencmis.client.api.Folder;
import org.apache.chemistry.opencmis.client.api.Session;
import org.apache.chemistry.opencmis.commons.PropertyIds;
import org.apache.chemistry.opencmis.commons.enums.BaseTypeId;
import org.apache.log4j.BasicConfigurator;
import org.apache.log4j.Logger;
import com.bizruntime.alfresco.session.CreateSession;
import com.bizruntime.alfresco.util.Config;

public class CreateLink {
    static Logger log = Logger.getLogger(CreateLink.class);

    public static void getLink() {
        // Create the session
        Session cmisSession = new CreateSession().getSession();
        log.debug("Session created...");
        Map<String, Object> properties = new HashMap<>();
        properties.put(PropertyIds.BASE_TYPE_ID, BaseTypeId.CMIS_ITEM.value());
        // Define a name and description for the link
        properties.put(PropertyIds.NAME, Config.getConfig().getProperty("nameOfLink"));
        properties.put("cmis:description", Config.getConfig().getProperty("linkDescription"));
        // Folder links use the app:folderlink type instead of app:filelink
        properties.put(PropertyIds.OBJECT_TYPE_ID, "I:app:folderlink");
        // Define the destination node reference
        properties.put("cm:destination", Config.getConfig().getProperty("destination-nodRef"));
        // Choose the folder where the link is to be created
        Folder rootFolder = cmisSession.getRootFolder();
        Folder targetFolder = (Folder) cmisSession.getObjectByPath(rootFolder.getPath() + Config.getConfig().getProperty("targetFolder"));
        cmisSession.createItem(properties, targetFolder);
        log.info("Link created successfully....");
    }

    public static void main(String[] args) {
        BasicConfigurator.configure();
        CreateLink.getLink();
    }
}

Reading from PubsubIO writing to DatastoreIO

Is it possible to create a pipeline that reads data from Pub/Sub and writes to Datastore? In my code I specify PubsubIO as the input and apply windowing to get a bounded PCollection, but it seems that it is not possible to use DatastoreIO.writeTo with options.setStreaming set to true, while that is required in order to use PubsubIO as input. Is there a way around this? Or is it simply not possible to read from Pub/Sub and write to Datastore?
Here's my code:
DataflowPipelineOptions options = PipelineOptionsFactory.create()
        .as(DataflowPipelineOptions.class);
options.setRunner(DataflowPipelineRunner.class);
options.setProject(projectName);
options.setStagingLocation("gs://my-staging-bucket/staging");
options.setStreaming(true);

Pipeline p = Pipeline.create(options);
PCollection<String> input = p.apply(PubsubIO.Read.topic("projects/" + projectName + "/topics/event-streaming"));
PCollection<String> inputWindow = input.apply(Window.<String>into(FixedWindows.of(Duration.standardSeconds(5)))
        .triggering(AfterPane.elementCountAtLeast(1))
        .discardingFiredPanes()
        .withAllowedLateness(Duration.standardHours(1)));
PCollection<String> inputDecode = inputWindow.apply(ParDo.of(new DoFn<String, String>() {
    private static final long serialVersionUID = 1L;
    public void processElement(ProcessContext c) {
        String msg = c.element();
        byte[] decoded = Base64.decodeBase64(msg.getBytes());
        String outmsg = new String(decoded);
        c.output(outmsg);
    }
}));
PCollection<DatastoreV1.Entity> inputEntity = inputDecode.apply(ParDo.of(new CreateEntityFn("stream", "events")));
inputEntity.apply(DatastoreIO.writeTo(datasetid));
p.run();
And this is the exception I get:
Exception in thread "main" java.lang.UnsupportedOperationException: The Write transform is not supported by the Dataflow streaming runner.
    at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner$StreamingWrite.apply(DataflowPipelineRunner.java:488)
    at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner$StreamingWrite.apply(DataflowPipelineRunner.java:480)
    at com.google.cloud.dataflow.sdk.runners.PipelineRunner.apply(PipelineRunner.java:74)
    at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner.apply(DataflowPipelineRunner.java:314)
    at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:358)
    at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:267)
    at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner.apply(DataflowPipelineRunner.java:312)
    at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:358)
    at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:267)
    at com.google.cloud.dataflow.sdk.values.PCollection.apply(PCollection.java:159)
    at my.own.project.google.dataflow.EventStreamingDataflow.main(EventStreamingDataflow.java:104)
The DatastoreIO sink is not currently supported in the streaming runner. To write to Datastore from a streaming pipeline, you can make direct calls to the Datastore API from a DoFn.
OK, after a lot of banging my head against the wall, I finally got it working. As danielm suggested, I'm making calls to the Datastore API from a ParDo DoFn. One problem was that I didn't realize there is a separate API for using Cloud Datastore outside of App Engine (com.google.api.services.datastore... vs. com.google.appengine.api.datastore...). Another problem was that there is apparently some kind of bug in the latest version of the Cloud Datastore API (google-api-services-datastore-protobuf v1beta2-rev1-4.0.0; I got an IllegalAccessError); I resolved that by using an older version (v1beta2-rev1-2.1.2).
So, here's my working code:
import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.io.PubsubIO;
import com.google.cloud.dataflow.sdk.options.DataflowPipelineOptions;
import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
import com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner;
import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.dataflow.sdk.transforms.ParDo;
import com.google.cloud.dataflow.sdk.values.PCollection;
import com.google.api.services.datastore.DatastoreV1.*;
import com.google.api.services.datastore.client.Datastore;
import com.google.api.services.datastore.client.DatastoreException;
import com.google.api.services.datastore.client.DatastoreFactory;
import static com.google.api.services.datastore.client.DatastoreHelper.*;
import java.security.GeneralSecurityException;
import java.io.IOException;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import org.json.simple.parser.ParseException;

//--------------------

public static void main(String[] args) {
    DataflowPipelineOptions options = PipelineOptionsFactory.create()
            .as(DataflowPipelineOptions.class);
    options.setRunner(DataflowPipelineRunner.class);
    options.setProject(projectName);
    options.setStagingLocation("gs://my-staging-bucket/staging");
    options.setStreaming(true);

    Pipeline p = Pipeline.create(options);
    PCollection<String> input = p.apply(PubsubIO.Read.topic("projects/" + projectName + "/topics/my-topic-name"));
    input.apply(ParDo.of(new DoFn<String, String>() {
        private static final long serialVersionUID = 1L;
        public void processElement(ProcessContext c) throws ParseException, DatastoreException {
            JSONObject json = (JSONObject) new JSONParser().parse(c.element());
            Datastore datastore = null;
            try {
                datastore = DatastoreFactory.get().create(getOptionsFromEnv()
                        .dataset(datasetid).build());
            } catch (GeneralSecurityException exception) {
                System.err.println("Security error connecting to the datastore: " + exception.getMessage());
            } catch (IOException exception) {
                System.err.println("I/O error connecting to the datastore: " + exception.getMessage());
            }
            Key.Builder keyBuilder = makeKey("my-kind");
            keyBuilder.getPartitionIdBuilder().setNamespace("my-namespace");
            Entity.Builder event = Entity.newBuilder()
                    .setKey(keyBuilder);
            event.addProperty(makeProperty("my-prop", makeValue((String) json.get("my-prop"))));
            CommitRequest commitRequest = CommitRequest.newBuilder()
                    .setMode(CommitRequest.Mode.NON_TRANSACTIONAL)
                    .setMutation(Mutation.newBuilder().addInsertAutoId(event))
                    .build();
            if (datastore != null) {
                datastore.commit(commitRequest);
            }
        }
    }));
    p.run();
}
And the dependencies in pom.xml:
<dependency>
  <groupId>com.google.cloud.dataflow</groupId>
  <artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
  <version>[1.0.0,2.0.0)</version>
</dependency>
<dependency>
  <groupId>com.google.apis</groupId>
  <artifactId>google-api-services-datastore-protobuf</artifactId>
  <version>v1beta2-rev1-2.1.2</version>
</dependency>
<dependency>
  <groupId>com.google.http-client</groupId>
  <artifactId>google-http-client</artifactId>
  <version>1.17.0-rc</version>
</dependency>
<!-- Some more.. like JUnit etc.. -->

Servlet has to be reloaded every day

I have created a servlet to access a database and give a response to a BlackBerry application. It was running fine during development, but after deploying it on a Tomcat 6.0 server and going live, the servlet has to be reloaded every morning. After that it works fine for the whole day, but the next morning when I request something it gives a blank page as the response, and my server admin tells me the servlet has to be reloaded.
Other applications hosted on the server are working fine and do not need a restart.
What might be the problem?
Adding the code, if it helps:
package com.ams.servlets;

import java.io.*;
import javax.servlet.*;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.sql.*;
import java.sql.ResultSet;
import com.cms.dbaccess.DataAccess;
import com.cms.utils.ApplicationConstants;
import com.cms.utils.ApplicationHelper;

public class BBRequestProcessorServlet extends HttpServlet {
    private static final long serialVersionUID = 1L;
    String userString;
    String jsonString = "";
    ResultSet rs = null;
    Connection connection = null;
    Statement statement = null;

    public enum db_name {
        //Test
        resource_management_db, osms_inventory_db;
    }

    public void init(ServletConfig config) throws ServletException {
        super.init(config);
        System.out.println("Inside init");
    }

    public void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        try {
            connection = DataAccess.connectToDatabase("xxx", connection);
            statement = DataAccess.createStatement(connection);
            statement = connection.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_READ_ONLY);
            rs = statement.executeQuery("query is here");
        } catch (SQLException e) {
            e.printStackTrace();
        }
        String db = request.getParameter("db");
        System.out.println("DATABASE NAME :" + db);
        if (db.equalsIgnoreCase("xxx")) {
            // Call to populate a JSONArray with the fetched ResultSet data
            jsonString = ApplicationHelper.populateJSONArray(rs);
        }
        response.setContentType(ApplicationConstants.JSON_CONTENT_TYPE);
        PrintWriter out = response.getWriter();
        out.print(jsonString);
        out.flush();
        out.close();
        System.out.println("json object sent");
        try {
            rs.close();
        } catch (SQLException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
The only error I could find was:
Jul 20, 2012 9:57:24 AM org.apache.catalina.loader.WebappClassLoader validateJarFile
INFO: validateJarFile(/usr/local/tomcat/apache-tomcat-6.0.20/webapps/MobileServlet/WEB-INF/lib/servlet-api.jar) - jar not loaded. See Servlet Spec 2.3, section 9.7.2. Offending class: javax/servlet/Servlet.class
The culprit is most likely the way you handle external DB resources like the Connection. This problem can happen when you keep a DB Connection open all the time without closing it. When a DB Connection has been open for too long, the DB will time it out and reclaim it. That is most likely what was happening overnight.
You should redesign your DataAccess and BBRequestProcessorServlet so that you nowhere keep hold of the Connection, Statement, and ResultSet as instance variables, or worse, static variables of the class. The Connection should be created in the very same scope as where you execute the SQL query/queries, and it should be closed in the finally block of the very same try block in which you created it.
By the way, your jsonString should absolutely not be declared as an instance variable of the servlet either; it is not threadsafe that way.
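A minimal sketch of that pattern, applied to the doGet() above (the null argument to DataAccess.connectToDatabase is an assumption about your helper; adapt it so that a fresh Connection is created per request):

public void doGet(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException {
    String jsonString = "";   // local variable, so each request gets its own copy
    Connection connection = null;
    Statement statement = null;
    ResultSet rs = null;
    try {
        // Assumption: passing null makes DataAccess open a fresh connection
        connection = DataAccess.connectToDatabase("xxx", null);
        statement = connection.createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE, ResultSet.CONCUR_READ_ONLY);
        rs = statement.executeQuery("query is here");
        jsonString = ApplicationHelper.populateJSONArray(rs);
    } catch (SQLException e) {
        throw new ServletException("Database access failed", e);
    } finally {
        // Always release resources, even when the query fails
        if (rs != null) try { rs.close(); } catch (SQLException ignore) {}
        if (statement != null) try { statement.close(); } catch (SQLException ignore) {}
        if (connection != null) try { connection.close(); } catch (SQLException ignore) {}
    }
    response.setContentType(ApplicationConstants.JSON_CONTENT_TYPE);
    PrintWriter out = response.getWriter();
    out.print(jsonString);
    out.flush();
}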
See also:
Is it safe to use a static java.sql.Connection instance in a multithreaded system?
How do servlets work? Instantiation, sessions, shared variables and multithreading
As to the error which you're seeing in the log, you should definitely remove the offending JAR. See also How do I import the javax.servlet API in my Eclipse project?
I am guessing; this will be clearer after seeing your logs.
It seems you have put servlet-api.jar into WEB-INF/lib, but it is already in Tomcat's lib directory.

How do I use a custom realm with GlassFish 3.1?

I would like to use a custom realm with GlassFish 3.1.
I took the two files from this topic to try: Custom Glassfish Security Realm does not work (unable to find LoginModule)
The CustomRealm.java
package com.company.security.realm;

import com.sun.appserv.security.AppservRealm;
import com.sun.enterprise.security.auth.realm.BadRealmException;
import com.sun.enterprise.security.auth.realm.InvalidOperationException;
import com.sun.enterprise.security.auth.realm.NoSuchRealmException;
import com.sun.enterprise.security.auth.realm.NoSuchUserException;
import java.util.Enumeration;
import java.util.Properties;
import java.util.Vector;

public class CustomRealm extends AppservRealm {
    Vector<String> groups = new Vector<String>();
    private String jaasCtxName;
    private String startWith;

    @Override
    public void init(Properties properties) throws BadRealmException, NoSuchRealmException {
        jaasCtxName = properties.getProperty("jaas-context", "customRealm");
        startWith = properties.getProperty("startWith", "z");
        groups.add("dummy");
    }

    @Override
    public String getAuthType() {
        return "Custom Realm";
    }

    public String[] authenticate(String username, char[] password) {
        // if (isValidLogin(username, password))
        // toArray(new String[0]) avoids the ClassCastException a plain cast of Object[] would cause
        return groups.toArray(new String[0]);
    }

    @Override
    public Enumeration getGroupNames(String username) throws InvalidOperationException, NoSuchUserException {
        return groups.elements();
    }

    @Override
    public String getJAASContext() {
        return jaasCtxName;
    }

    public String getStartWith() {
        return startWith;
    }
}
And the custom login module:
package com.company.security.realm;

import com.sun.appserv.security.AppservPasswordLoginModule;
import com.sun.enterprise.security.auth.login.common.LoginException;
import java.util.Set;
import org.glassfish.security.common.PrincipalImpl;

public class CustomLoginModule extends AppservPasswordLoginModule {

    @Override
    protected void authenticateUser() throws LoginException {
        _logger.info("CustomRealm : authenticateUser for " + _username);
        final CustomRealm realm = (CustomRealm) _currentRealm;
        if ((_username == null) || (_username.length() == 0) || !_username.startsWith(realm.getStartWith()))
            throw new LoginException("Invalid credentials");
        String[] grpList = realm.authenticate(_username, getPasswordChar());
        if (grpList == null) {
            throw new LoginException("User not in groups");
        }
        _logger.info("CustomRealm : authenticateUser for " + _username);
        Set principals = _subject.getPrincipals();
        principals.add(new PrincipalImpl(_username));
        this.commitUserAuthentication(grpList);
    }
}
I also added the module to the login conf file:
customRealm {
com.company.security.realm.CustomLoginModule required;
};
And I copied my two .class files into glassfish3/glassfish/domains/domain1/lib/classes/
as well as into glassfish3/glassfish/lib.
Every time I try to create the new realm I get the same error:
./asadmin --port 4949 create-auth-realm --classname com.company.security.realm.CustomRealm --property jaas-context=customRealm:startWith=a customRealm
remote failure: Creation of Authrealm customRealm failed. com.sun.enterprise.security.auth.realm.BadRealmException: java.lang.ClassNotFoundException: com.company.security.realm.CustomRealm not found by org.glassfish.security [101]
com.sun.enterprise.security.auth.realm.BadRealmException: java.lang.ClassNotFoundException: com.company.security.realm.CustomRealm not found by org.glassfish.security [101]
Command create-auth-realm failed.
I think I don't really understand the proper way to add my two files to GlassFish.
These two files were created and compiled in Eclipse, in a plain Java project (custom login).
Can someone help?
Thanks a lot in advance,
loic
Did you package it as an OSGi module (see the answer in the post you referenced)? If so, don't copy the jar file into $GF_HOME/lib or anything; instead, deploy it as an OSGi module:
asadmin deploy --type osgi /path/to/CustomRealm.jar
Then add the login.conf settings. To be on the safe side, I'd restart GF (asadmin restart-domain); then you can create the realm with the command you have there.
