Is pax-exam supposed to simply incorporate ordinary dependencies in the probe bundle?

If I have a dependency of scope 'test' that is not an OSGi bundle, should I expect pax-exam to simply incorporate it in the probe bundle, or do I need to explicitly wrap it?
I have a case where neither approach is working, and I'm trying to diagnose the problem.

The probe bundle only includes the classes from the class path component containing the test class (usually just what's in src/test/java of your project).
You can customize the probe, or simply provision your test dependencies as separate bundles. The probe imports all packages dynamically.
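For illustration, a sketch of how the second option might look in a pax-exam configuration method (the Maven coordinates are placeholders, not from the question):

import static org.ops4j.pax.exam.CoreOptions.*;

import org.ops4j.pax.exam.Configuration;
import org.ops4j.pax.exam.Option;

public class MyIntegrationTest {

    @Configuration
    public Option[] config() {
        return options(
            junitBundles(),
            // a real OSGi bundle can be provisioned directly
            mavenBundle("com.example", "some-bundle").versionAsInProject(),
            // a plain (non-OSGi) jar gets wrapped into a bundle on the fly
            wrappedBundle(maven("com.example", "plain-jar").versionAsInProject())
        );
    }
}

wrappedBundle uses Pax URL's wrap: handler under the hood, so the plain jar gets a generated OSGi manifest instead of ending up inside the probe.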

Related

java.lang.LinkageError: loader constraint violation: loader

I am getting the error below while connecting using HttpPost:
Caused by: java.lang.LinkageError: loader constraint violation: loader
(instance of org/jboss/osgi/framework/internal/HostBundleClassLoader)
previously initiated loading for a different type with name
"org/apache/http/client/methods/HttpPost"
I am using an OSGi bundle, and I have added all the required dependencies.
Can anyone help me resolve it?
The Java language is based on a single namespace. That is, the language is built around the concept that a class name is used only once. Class loaders were designed to load code over the internet but accidentally allowed the same class name to be used by two different class loaders.
In OSGi, each bundle has a class loader that directly loads the classes from its own bundle but uses the class loader of other bundles for any imported classes.
In such a mesh of class loaders, you get the situation that a class C loaded from bundle A can reference a class X and a class Y loaded via other class loaders. Since X and Y have different names, that is fine. However, X could refer to a class Z, and Y could also refer to a class Z, and those two Z's could come from different loaders. The original class C from bundle A can therefore see Z from two different class loaders, and that is a LinkageError.
This mesh of class loaders works very well when all bundles are correct; you should never get this kind of error when you do not hack your bundles. These errors are invariably caused by complex setups that do not follow the OSGi rules and that maintain the bundle's manifest by hand.
In this case, the class name that can be seen multiple times is org.apache.http.client.methods.HttpPost. So you have a setup where multiple bundles export this class, and that is the first place to look. Since you were able to start the bundle, the metadata must be wrong: OSGi has special metadata, the so-called uses constraints, that lets this error be detected before you start the bundle.
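For illustration, a uses constraint on an export looks like this in a bundle manifest (the version is made up):

Export-Package: org.apache.http.client.methods;uses:="org.apache.http";version="4.5"

The uses directive tells the resolver that anyone importing org.apache.http.client.methods from this bundle must be wired to the same org.apache.http that this bundle sees, so an inconsistent wiring is refused at resolve time instead of failing later with a LinkageError.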
On Apache Felix, you get an extensive analysis of the problem; if you can run your code on Apache Felix, that would be the easiest route. Looking at your error, you seem to be running on JBoss. They have always played a bit fast and loose with the OSGi rules to make it easier to run enterprise software, software that rarely does the work to provide OSGi metadata and that is well known for its class loader hacks. (Only since the Java Module System have a lot of people started to understand what OSGi was doing and why it was needed.)

SilverStripe env variable value in config

I am trying to figure out whether SilverStripe 4.2 supports referencing environment variables in the config files, in a similar fashion to Symfony.
So far I was able to find the class responsible for building configs, which doesn't seem to have this functionality.
I thought of injecting another layer that would parse the YAML files and process the environment references, but it seems that you cannot extend a service, since there is no dependency injection container available at that point?
Is there maybe a different way to do this? All that I am trying to do is use environment variables in YAML config files.
You can use environment variables in YAML config, provided it's config for the Injector class. You can't use them outside of Injector config (as of 4.2).
Wrap them in backticks for them to be parsed into config:
SilverStripe\Core\Injector\Injector:
  MyServiceClass:
    properties:
      MyProperty: '`ENV_VAR_HERE`'
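ENV_VAR_HERE is a placeholder; in SilverStripe 4 such a variable would typically be defined in the project's .env file, for example:

ENV_VAR_HERE="some-value"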

Configure an sbt build for a test framework of your own

Along one line of thought, I would like to configure a build with a custom task that will serve instead of the default test task. I will not be registering it as an "sbt test framework", as I'd like to avoid the syntax, limitations, and history that come with that. For example, I'd like more flexibility in formatting, storing, and sending test results.
Would it be possible to introduce an sbt task that mimics sbt's default test task, in the following essential senses (see the sketch after this list):
depend on compilation
be a dependency for other tasks such as artifact publishing tasks, and have them depend on a success return value from it
be able to provide it with library dependencies which will be excluded from final publishable artifacts
be able to control it just like any other task dependency
etc.
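A minimal sketch of such a task in build.sbt, assuming sbt 1.x (the task name and log message are hypothetical):

// build.sbt
val myTest = taskKey[Unit]("Runs tests with custom reporting")

myTest := {
  val _ = (Test / compile).value   // depend on compilation
  // run your tests here; format, store and send results however you like.
  // Throwing an exception fails this task and everything that depends on it.
  streams.value.log.info("custom test run finished")
}

// make publishing depend on a successful custom test run
publish := publish.dependsOn(myTest).value

// Test-scoped dependencies are excluded from published artifacts as usual
libraryDependencies += "junit" % "junit" % "4.13.2" % Test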
Alternatively, can you delineate the structure of the mostly undocumented sbt TestFramework interface, to the level where it becomes straightforward to put your own formatting, test-output logic, and test-results publishing code into a TestFramework implementation? I have mostly failed at locating ScalaTest's implementation of it.
Thanks in advance for your answers on both techniques.

Does glassfish resolve the path to libraries correctly?

I am interested in this question due to the problem I described here. How does Glassfish look for the required classes anyway? Suppose there are two libraries in the application's pom.xml (in dependencies): one is declared with scope provided, the other with the default scope.
Therefore, I have two libraries: A.jar is in the Glassfish lib folder, and B.jar is in WEB-INF/lib of the war module that I deploy.
What is the order of resolving the dependencies here? I assume that:
First, look in the WEB-INF/lib folder to see if any jar contains the class.
After that, look in the Glassfish lib folder to see if any jar contains the class.
Is that correct? Is the situation where a class in A.jar references a class in B.jar (and vice versa) legal in such a configuration?
To be more specific, I have Glassfish 2.1.
According to the class loader documentation for GF2, I would say it's the other way around.
Note that the class loader hierarchy is not a Java inheritance hierarchy, but a delegation hierarchy. In the delegation design, a class loader delegates class loading to its parent before attempting to load a class itself. If the parent class loader cannot load a class, the class loader attempts to load the class itself. In effect, a class loader is responsible for loading only the classes not available to the parent. Classes loaded by a class loader higher in the hierarchy cannot refer to classes available lower in the hierarchy.
Note: related documentation for GF3.1 is here and here.
However, you can influence the behavior through the Glassfish-specific deployment descriptor with:
<class-loader delegate="true/false"/>
You can find more about it by following the first link.
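For GF2 that descriptor is WEB-INF/sun-web.xml; a minimal sketch (assuming the default is delegate="true"):

<sun-web-app>
  <!-- false: try the web app's own WEB-INF/lib before delegating to the parent loaders -->
  <class-loader delegate="false"/>
</sun-web-app>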

Apache Karaf - bundle starts but does nothing?

I'm new to Karaf. I have a jar that has a class App with a main method. When I drop the jar in, the Karaf log service console says the bundle is started, but nothing seems to happen. The first thing the jar does is a simple database write, so I can tell whether it's running (no log file is generated, although one is expected).
The jar depends on lots of other jars. Our sysadmin will not install Maven on the production servers. Where does one put helper jars (like mysql-connector-java-[version].jar)?
Does Karaf use the Manifest file to find the main class? Do I have to implement some special interface or something?
Thanks for any help.
As Karaf is an OSGi container, you should first read up on how to write proper OSGi bundles.
First of all, you'll need an Activator that starts your bundle (just like a main method); a Main-Class is never interpreted. Karaf, being an OSGi container, does "read" the manifest, but only to make sure, first, that it's a proper OSGi bundle and, second, to determine how resolving should take place, by reading the Import-Package/Export-Package headers.
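A minimal BundleActivator sketch (App and its run() method stand in for your existing code); the class is announced to the framework via the Bundle-Activator header in the manifest:

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Activator implements BundleActivator {

    public void start(BundleContext context) throws Exception {
        // called when the bundle starts; takes over the role of main()
        new App().run(); // placeholder for your existing entry point
    }

    public void stop(BundleContext context) throws Exception {
        // release anything acquired in start()
    }
}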
Regarding the packaging (using lots of other jars/bundles), you can either build a custom Karaf distribution (read the Karaf documentation on how to do this) or create a KAR containing your bundles and a feature.xml (again, take a look at the Karaf documentation).
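A sketch of such a feature.xml (Maven coordinates are placeholders; the wrap: prefix turns a plain jar such as the MySQL driver into a bundle on the fly, in case it is not already one):

<features name="my-app-features" xmlns="http://karaf.apache.org/xmlns/features/v1.0.0">
  <feature name="my-app" version="1.0.0">
    <bundle>wrap:mvn:mysql/mysql-connector-java/5.1.49</bundle>
    <bundle>mvn:com.example/my-bundle/1.0.0</bundle>
  </feature>
</features>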
