I need to write JUnit tests for my Spring application, which uses a MongoDB database.
I have no previous experience combining JUnit testing with Spring and MongoDB.
Any reply will be a great help to me.
Thanks and regards.
I would start by looking at the JUnit documentation, specifically at Assertions. When testing with a dependency injection framework (e.g. Spring), a mocking framework is essential. Check out EasyMock or Mockito.
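For a pure unit test that never touches a real database, a mocking framework lets you stub MongoTemplate directly. Below is a minimal sketch, assuming Mockito and Spring Data MongoDB are on the test classpath; the User class and the test class name are just illustrations, not from the original posts.

import static org.junit.Assert.assertEquals;
import static org.mockito.Matchers.any;
import static org.mockito.Matchers.eq;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.Test;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Query;

public class MockedTemplateTest {

    // Tiny document class used only for this example.
    static class User {
        private final String name;
        User(String name) { this.name = name; }
        String getName() { return name; }
    }

    @Test
    public void stubbedFindOneReturnsCannedUser() {
        // The mock stands in for a real MongoTemplate, so no MongoDB instance is needed.
        MongoTemplate mongoTemplate = mock(MongoTemplate.class);
        when(mongoTemplate.findOne(any(Query.class), eq(User.class)))
                .thenReturn(new User("alice"));

        // In real code the class under test would receive the mocked template via injection.
        User found = mongoTemplate.findOne(new Query(), User.class);
        assertEquals("alice", found.getName());
    }
}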
I use Spring mongo template in a Spring MVC app and JUnit 4.8.2 for unit tests.
Just create a bean for your mongoTemplate and use @Autowired to inject it into your classes. As for the tests, follow these steps:
1. Create a new JUnit test case (right-click your class in the Package Explorer, then New -> JUnit Test Case). It will create a test method for each method of your class that you specify.
2. Now you'll have your tests at src/test/java and your resources at src/test/resources. It is better to create a Spring config file just for the tests, so you can point the tests to a local MongoDB instance and your application to, say, a development MongoDB instance. So create the config file at src/test/resources, name it testSpringConfig.xml or whatever, and define the beans there:
<mongo:db-factory dbname="myDB" host="localhost"
username="myDbUser" password="myPass"/>
<beans:bean id="mongoTemplate"
class="org.springframework.data.mongodb.core.MongoTemplate">
<beans:constructor-arg name="mongoDbFactory" ref="mongoDbFactory" />
</beans:bean>
<beans:bean id="mongoTemplateLibrary"
class="org.springframework.data.mongodb.core.MongoTemplate">
<beans:constructor-arg name="mongoDbFactory" ref="mongoDbFactory" />
</beans:bean>
3. In your test class, use annotations to reference your config file:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"/testSpringConfig.xml"})
public class MyDaoTest {
4. Once you have your test class set up, inject the Mongo template:
@Autowired
@Qualifier("mongoTemplate")
private MongoTemplate mongoTemplate;
5. Now you can use it to insert/remove/find/update directly against Mongo (although this would be an integration test more than a unit test).
For example, you can remove the objects you inserted in your tests using the tearDown() method:
@After
public void tearDown() {
mongoTemplate.remove(myObject);
}
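Putting steps 3-5 together, an integration-style test against the local test MongoDB could look like the sketch below. The Person document class and the IDs used are illustrative, not part of the original answer.

import static org.junit.Assert.assertEquals;

import org.junit.After;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"/testSpringConfig.xml"})
public class PersonDaoIntegrationTest {

    // Illustrative document class; in a real project it would live in src/main/java.
    public static class Person {
        private String id;
        private String name;
        public Person() { }
        public Person(String id, String name) { this.id = id; this.name = name; }
        public String getName() { return name; }
    }

    @Autowired
    @Qualifier("mongoTemplate")
    private MongoTemplate mongoTemplate;

    @Test
    public void savesAndReadsBackDocument() {
        mongoTemplate.insert(new Person("42", "Alice"));

        Person loaded = mongoTemplate.findOne(
                new Query(Criteria.where("_id").is("42")), Person.class);
        assertEquals("Alice", loaded.getName());
    }

    @After
    public void tearDown() {
        // Remove whatever the test inserted so runs stay repeatable.
        mongoTemplate.remove(new Query(Criteria.where("_id").is("42")), Person.class);
    }
}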
Configure the pom.xml file as follows:
<properties>
....
<junit.version>4.10</junit.version>
....
</properties>
<dependencies>
.....
<!-- Test dependencies -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version><!-- JUNIT CHANGES -->
<scope>test</scope>
</dependency>
<dependency>
<groupId>de.flapdoodle.embed</groupId>
<artifactId>de.flapdoodle.embed.mongo</artifactId>
<version>1.26</version>
<scope>test</scope>
</dependency>
<!-- Mockito --><!-- JUNIT MOCKITO DEPENDENCY ADDED -->
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>1.8.4</version>
</dependency>
<!-- Mockito -->
......
</dependencies>
<build>
....
<plugins>
......
<!-- JUNIT PLUGIN CODE ADDED-->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>1.5</source>
<target>1.5</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.14</version>
<configuration>
<useSystemClassLoader>false</useSystemClassLoader>
</configuration>
<dependencies>
<dependency>
<groupId>org.apache.maven.surefire</groupId>
<artifactId>surefire-junit47</artifactId>
<version>2.14</version>
</dependency>
</dependencies>
</plugin>
<!-- END OF JUNIT CODE -->
.....
</plugins>
<sourceDirectory>src/main/java</sourceDirectory>
<testSourceDirectory>src/test/junit</testSourceDirectory>
</build>
....
After setting up the pom configuration, write the test case code using the following steps.
Assuming you are using an IDE like STS, just right-click the corresponding class, click on JUnit Test Case, and create the class under the src/test/java path.
For Spring controller classes, first initialize all the required objects and values for the class under test in the setUp() method.
Then call the corresponding methods from the test case and make the appropriate JUnit assertions if the method returns a value such as a String/int/boolean, etc.
Like below:
@Test
public void testSave() {
MessageValue messageValue = saveController.saveValue(value+"::Test", model);
Query query = new Query(Criteria.where("value").is("Test").and("_id").is(id));
Save saveValueThis = template.findOne(query, Save.class);
assertEquals("Success", messageValue.getMessage());
// delete the saved value at the end of test case
testDelete();
}
If it returns a UI page, that is, a redirection to some other screen, then check it like below:
mockMvc = MockMvcBuilders.standaloneSetup(yourController).build();
mockMvc.perform(get("/request/url").accept(MediaType.ALL))
.andExpect(status().isOk());
Finally, reset the initialized values in the test case's tearDown() method.
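Putting the setUp()/tearDown() advice and the MockMvc check together, a controller test skeleton might look like the sketch below; SaveController is an illustrative name for the controller under test, not a class from the original answer.

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.springframework.http.MediaType;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.setup.MockMvcBuilders;

public class SaveControllerTest {

    private SaveController saveController; // hypothetical controller under test
    private MockMvc mockMvc;

    @Before
    public void setUp() {
        // Initialize the objects the tests need, as described above.
        saveController = new SaveController();
        mockMvc = MockMvcBuilders.standaloneSetup(saveController).build();
    }

    @Test
    public void screenRequestReturnsOk() throws Exception {
        mockMvc.perform(get("/request/url").accept(MediaType.ALL))
               .andExpect(status().isOk());
    }

    @After
    public void tearDown() {
        // Reset the initialized values so each test starts clean.
        saveController = null;
        mockMvc = null;
    }
}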
I'm trying to create a JFX11 self-contained jar using Maven dependencies. From the research I've done, it seems the best way to do this is through the Maven Shade plugin. However, when I run it, I get this error:
Error: JavaFX runtime components are missing, and are required to run this application
I don't understand why this is happening. What am I messing up? Is there a better way to do this? I've also tried the maven assembly plugin with the same message.
pom file for reference
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>Application</groupId>
<artifactId>Main</artifactId>
<packaging>jar</packaging>
<version>1.0-SNAPSHOT</version>
<name>SpaceRunner</name>
<url>http://maven.apache.org</url>
<dependencies>
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-controls</artifactId>
<version>11</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.0</version>
<configuration>
<release>10</release>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.6.0</version>
<executions>
<execution>
<goals>
<goal>java</goal>
</goals>
</execution>
</executions>
<configuration>
<mainClass>Application.Main</mainClass>
</configuration>
</plugin>
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>
Application.Main
</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.0</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>Application.Main</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
UPDATE 10/2021
Since JavaFX 16, a warning is displayed when JavaFX doesn't run on the module path, which is the case for an uber/fat jar:
$ java -jar myFatJar-1.0-SNAPSHOT.jar
Oct 02, 2021 1:45:21 PM com.sun.javafx.application.PlatformImpl startup
WARNING: Unsupported JavaFX configuration: classes were loaded from 'unnamed module @14c24f4c'
Also, you get a warning from the shade plugin itself:
[WARNING] Discovered module-info.class. Shading will break its strong encapsulation.
While these warnings can be initially ignored, there is a reason for them.
As explained in this CSR:
JavaFX is built and distributed as a set of named modules, each in its own modular jar file, and the JavaFX runtime expects its classes to be loaded from a set of named javafx.* modules, and does not support loading those modules from the classpath.
And:
when the JavaFX classes are loaded from the classpath, it breaks encapsulation, since we no longer get the benefit of the java module system.
Therefore, even though this widely accepted answer explains how an uber/fat jar can be created in Maven projects, its use is discouraged, and other, more modern alternatives to distribute your application, like jlink, jpackage or native-image, should be used.
ORIGINAL ANSWER
This answer explains why a fat/uber jar fails on JavaFX 11. In short:
This error comes from sun.launcher.LauncherHelper in the java.base module. The reason for this is that the Main app extends Application and has a main method. If that is the case, the LauncherHelper will check for the javafx.graphics module to be present as a named module. If that module is not present, the launch is aborted.
And it already proposes a fix for Gradle.
For Maven the solution is exactly the same: provide a new main class that doesn't extend from Application.
You will have a new class in your application package (bad name):
// NewMain.java
public class NewMain {
public static void main(String[] args) {
Main.main(args);
}
}
And your existing Main class, as is:
//Main.java
public class Main extends Application {
@Override
public void start(Stage stage) {
...
}
public static void main(String[] args) {
launch(args);
}
}
Now you need to modify your pom and set your main class for the different plugins:
<mainClass>application.NewMain</mainClass>
Platform-specific Fat jar
Finally, with the shade plugin you are going to produce a fat jar, on your machine.
This means that, so far, your JavaFX dependencies use a single platform-specific classifier. If, for instance, you are on Windows, Maven will internally use the win classifier. This has the effect of including only the native libraries for Windows.
So you are using:
org.openjfx:javafx-controls:11
org.openjfx:javafx-controls:11:win
org.openjfx:javafx-graphics:11
org.openjfx:javafx-graphics:11:win <-- this contains the native dlls for Windows
org.openjfx:javafx-base:11
org.openjfx:javafx-base:11:win
Now, if you produce the fat jar, you will bundle all those dependencies (and any other regular third-party dependencies from your project), and you will be able to run your project as:
java -jar myFatJar-1.0-SNAPSHOT.jar
While this is very nice, if you want to distribute your jar, be aware that this jar is not cross-platform, and it will work only on your platform, in this case Windows.
Cross-Platform Fat Jar
There is a solution to generate a cross-platform jar that you can distribute: include the rest of the native libraries of the other platforms.
This can be easily done, as you just need to include the graphics module dependencies for the three platforms:
<dependencies>
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-controls</artifactId>
<version>11</version>
</dependency>
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-graphics</artifactId>
<version>11</version>
<classifier>win</classifier>
</dependency>
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-graphics</artifactId>
<version>11</version>
<classifier>linux</classifier>
</dependency>
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-graphics</artifactId>
<version>11</version>
<classifier>mac</classifier>
</dependency>
</dependencies>
Size
There is a main issue with this approach: the size. As you can see in this other answer, if you use the WebView control, you will be bundling around 220 MB due to the WebKit native libraries.
I'm running into an error running the native image produced by GraalVM (latest GraalVM built on JDK 11) via the Gluon Client Plugin.
javafx.fxml.LoadException: Error resolving onAction='#loginAction', either the event handler is not in the Namespace or there is an error in the script. fxml/LoginScreen.fxml:17
The compilation step works fine:
mvn clean client:build
I see the binary in a folder called "projectname/target/client/x86_64-linux/binaryname"
The above error is produced when I run the executable via "./binaryname".
The FXML line of code it's complaining about on line 17 is:
<Button fx:id="_loginButton" layoutX="516.0" layoutY="174.0" mnemonicParsing="false" onAction="#loginAction" prefHeight="28.0" prefWidth="94.0" text="Login" />
The backing code logic is as follows, marked with @FXML:
@FXML
void loginAction(ActionEvent event) throws InterruptedException {
LoginService loginservice = new LoginService(_usernameTextField.getText(), _passwordTextField.getText());
According to a JavaFX common errors list, the problem is usually that the onAction event does not have the same name as specified in the controller (Introduction to JavaFX for Beginner Programmers, pg. 27). However, this is not the case; my program's naming is accurate. Using the JavaFX Maven plugin (separate from the Gluon Client plugin) with
mvn javafx:run
the program starts up correctly and works as expected. If I need to post more information, please let me know.
Here is my pom.xml (I only replaced the name of my package below):
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.demo</groupId>
<artifactId>com-demo-management-ui</artifactId>
<version>0.0.1-SNAPSHOT</version>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>11</maven.compiler.source>
<maven.compiler.target>11</maven.compiler.target>
<client.plugin.version>0.1.26</client.plugin.version>
</properties>
<dependencies>
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-controls</artifactId>
<version>11</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.openjfx/javafx-fxml -->
<dependency>
<groupId>org.openjfx</groupId>
<artifactId>javafx-fxml</artifactId>
<version>11</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.httpcomponents/httpclient -->
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.5.12</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.google.code.gson/gson -->
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.6</version>
</dependency>
<!-- https://mvnrepository.com/artifact/commons-io/commons-io -->
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.6</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.1</version>
<configuration>
<release>11</release>
</configuration>
</plugin>
<plugin>
<groupId>org.openjfx</groupId>
<artifactId>javafx-maven-plugin</artifactId>
<version>0.0.3</version>
<configuration>
<mainClass>com.demo.Launcher</mainClass>
</configuration>
</plugin>
<plugin>
<groupId>com.gluonhq</groupId>
<artifactId>client-maven-plugin</artifactId>
<version>${client.plugin.version}</version>
<configuration>
<!-- Uncomment to run on iOS: -->
<!-- <target>ios</target> -->
<mainClass>com.demo.Launcher</mainClass>
<graalvmHome>/opt/graalvm-ce-java11-20.2.0-dev/</graalvmHome>
</configuration>
</plugin>
</plugins>
</build>
<pluginRepositories>
<pluginRepository>
<id>gluon-releases</id>
<url>http://nexus.gluonhq.com/nexus/content/repositories/releases/</url>
</pluginRepository>
</pluginRepositories>
And finally, here is the code where I set the controller (this is in a method I call to exchange my views as I need them, so the controller is passed as an argument when creating a view):
FXMLLoader fxmlLoader = new FXMLLoader(getClass().getResource("/fxml/"+baseController.getFxmlFile()));
fxmlLoader.setController(baseController);
Parent parent = fxmlLoader.load();
Scene scene = new Scene(parent);
Stage stage = new Stage();
stage.setFullScreen(fullScreen);
stage.setMaximized(setMaximized);
stage.setScene(scene);
stage.show();
If you have a look at the HelloFXML sample in the Client samples repository, you will see it uses a typical FXML file with a controller:
<AnchorPane fx:id="pane" ... fx:controller="hellofx.HelloController">
In your case, you don't have the controller in the FXML file, but you provide it like:
fxmlLoader.setController(new hellofx.HelloController());
As you know, the FXMLLoader uses reflection to instantiate the controller and controls, and to invoke the methods found while parsing the FXML file.
Either way, when you click the button that triggers the loginAction method, the FXMLLoader processes that with this call:
MethodHelper.invoke(method, controller, params);
which uses reflection to handle that event.
With GraalVM, reflection is an issue, and you have to "help" it a little bit, by providing the classes/methods/fields that are going to be used reflectively at some point. Find more about it here.
The Client plugin already takes care for you of adding the JavaFX core classes and methods. You can see what's added in target/client/x86_64-darwin/gvm/reflectionconfig-x86_64-darwin.json.
However, your custom classes have to be added to that file. There are two ways you can do that:
As in HelloFXML, via the configuration/reflectionList, you provide the custom class(es) that will be used reflectively:
<plugin>
<groupId>com.gluonhq</groupId>
<artifactId>client-maven-plugin</artifactId>
<version>${client.plugin.version}</version>
<configuration>
<reflectionList>
<list>hellofx.HelloController</list> <!-- your custom classes -->
</reflectionList>
<mainClass>${mainClassName}</mainClass>
</configuration>
</plugin>
This has the effect of opening all class methods/fields to reflection. You will see the result in the json file as:
{
"name" : "hellofx.HelloController",
"allDeclaredConstructors" : true,
"allPublicConstructors" : true,
"allDeclaredFields" : true,
"allPublicFields" : true,
"allDeclaredMethods" : true,
"allPublicMethods" : true
}
...
This should be enough to fix your issue.
Via config files. As you can read in the Client's documentation, you can add a config file (reflectionconfig.json) to META-INF/substrate/config instead:
[
{
"name":"hellofx.HelloController",
"methods":[{"name":"loginAction","parameterTypes":["javafx.event.ActionEvent"] }]
}
]
This will fix it as well. Of course, it might require adding other methods you have in the controller too (like initialize).
This will open only this method to reflection, so it has a lower impact on the memory footprint, and it follows what the plugin does with the JavaFX core classes.
I'm new to Spring Reactor. I've been trying to understand how the ConnectableFlux class works. I've read the docs and seen examples posted online but am stuck on an issue.
Can someone tell me why the connect() method is blocking? I don't see anything in the documentation that says it should block, especially since it returns a Disposable for later use.
Given my example code below, I never get past the connect() method.
I'm basically trying to simulate the old-style listener interface paradigm I've used many times in the past. I want to learn how to recreate a service class & listener architecture using Reactive Streams, where I have a simple service class with a method called "addUpdateListener(Listener l)", and when the service class's "doStuff()" method runs, it triggers some events to be passed to any listeners.
I should say that I will be writing an API for others to use, so when I say service class I don't mean @Service in Spring terms. It will be a plain Java singleton class.
I'm just using Spring Reactor for the Reactive Streams. I was also looking at RxJava, but wanted to see if Spring Reactor Core would work.
I started with the test class below just to understand the library syntax and then got stuck on the blocking issue.
I think what I'm looking for is described here: Multiple Subscribers
UPDATE: Running my code through a debugger, the code inside ConnectableFlux's connect() method never returns. It hangs on the internal connect call and never comes back from that method.
reactor.core.publisher.ConnectableFlux
public final Disposable connect() {
Disposable[] out = new Disposable[]{null};
this.connect((r) -> {
out[0] = r;
});
return out[0];
}
Any help would be great!
Here is my maven pom as well
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>SpringReactorTest</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<maven.compiler.target>1.8</maven.compiler.target>
<maven.compiler.source>1.8</maven.compiler.source>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-bom</artifactId>
<version>Bismuth-RELEASE</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-core</artifactId>
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.2.3</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.2</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<exclude>classworlds:classworlds</exclude>
<exclude>junit:junit</exclude>
<exclude>jmock:*</exclude>
<exclude>*:xml-apis</exclude>
<exclude>org.apache.maven:lib:tests</exclude>
<exclude>log4j:log4j:jar:</exclude>
</excludes>
</artifactSet>
<minimizeJar>true</minimizeJar>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import reactor.core.Disposable;
import reactor.core.publisher.ConnectableFlux;
import reactor.core.publisher.Flux;
import java.util.concurrent.TimeUnit;
import static java.time.Duration.ofSeconds;
/**
* Testing ConnectableFlux
*/
public class Main {
private final static Logger LOG = LoggerFactory.getLogger(Main.class);
public static void main(String[] args) throws InterruptedException {
Main m = new Main();
// Get the connectable
ConnectableFlux<Object> flux = m.fluxPrintTime();
// Subscribe some listeners
// Tried using a new thread for the subscribers, but the connect call still blocks
LOG.info("Subscribing");
Disposable disposable = flux.subscribe(e -> LOG.info("Fast 1 - {}", e));
Disposable disposable2 = flux.subscribe(e -> LOG.info("Fast 2 - {}", e));
LOG.info("Connecting...");
Disposable connect = flux.connect();// WHY does this block??
LOG.info("Connected..");
// Sleep 5 seconds
TimeUnit.SECONDS.sleep(5);
// Cleanup - Remove listeners
LOG.info("Disposing");
connect.dispose();
disposable.dispose();
disposable2.dispose();
LOG.info("Disposed called");
}
// Just create a test flux
public ConnectableFlux<Object> fluxPrintTime() {
return Flux.create(fluxSink -> {
while (true) {
fluxSink.next(System.currentTimeMillis());
}
}).doOnSubscribe(ignore -> LOG.info("Connecting to source"))
.sample(ofSeconds(2))
.publish();
}
}
Running the above code gives the following output; it just prints the time in milliseconds until I Ctrl-C the process:
09:36:21.463 [main] DEBUG reactor.util.Loggers$LoggerFactory - Using Slf4j logging framework
09:36:21.478 [main] INFO Main - Subscribing
09:36:21.481 [main] INFO Main - Connecting...
09:36:21.490 [main] INFO Main - Connecting to source
09:36:23.492 [parallel-1] INFO Main - Fast 1 - 1589808983492
09:36:23.493 [parallel-1] INFO Main - Fast 2 - 1589808983492
09:36:25.493 [parallel-1] INFO Main - Fast 1 - 1589808985493
09:36:25.493 [parallel-1] INFO Main - Fast 2 - 1589808985493
09:36:27.490 [parallel-1] INFO Main - Fast 1 - 1589808987490
09:36:27.490 [parallel-1] INFO Main - Fast 2 - 1589808987490
09:36:29.493 [parallel-1] INFO Main - Fast 1 - 1589808989493
...
I received an answer from the Spring Reactor team and I'm just posting it here in case anyone else runs into this...
The crux of the issue is that you're entering an infinite loop in
Flux.create. The moment the flux gets subscribed, it will enter the
loop and never exit it, producing data as fast as the CPU can. With
Flux.create you should at least have a call to sink.complete() at some
point.
I suggest to experiment with eg. Flux.interval as a source for your
regular ticks, it will get rid of that extraneous complexity of
Flux.create, which puts you in charge of lower level concepts of
Reactive Streams (the onNext/onComplete/onError signals, that you'll
need to learn about, but maybe not just right now 😄 ).
As a side note, I would take into consideration that emulating a
listener-based API with Reactor (or RxJava) is not doing justice to
what reactive programming can do. It is a constrained use case that
will probably drive your focus and expectations away from the real
benefits of reactive programming
From a higher perspective:
The broad idea of ConnectableFlux#connect() is that you have a
"transient" source that you want to share between multiple
subscribers, but it gets triggered the moment someone subscribes to
it. So in order not to miss any event, you turn the source into a
ConnectableFlux, perform some set up (subscribe several subscribers)
and manually trigger the source (by calling connect()). It is not
blocking, and returns a Disposable that represents the upstream
connection (in case you also want to manually cancel/dispose the whole
subscription).
PS: Bismuth is now clearly outdated, prefer using the latest
Dysprosium release train
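Following that advice, here is a minimal sketch of the same scenario rewritten around Flux.interval; the class name, timings, and log wording are illustrative. connect() returns immediately, and the main thread goes on to sleep and then dispose everything.

import java.time.Duration;
import java.util.concurrent.TimeUnit;

import reactor.core.Disposable;
import reactor.core.publisher.ConnectableFlux;
import reactor.core.publisher.Flux;

public class IntervalExample {

    public static void main(String[] args) throws InterruptedException {
        // Flux.interval emits an increasing Long every 2 seconds on a timer thread,
        // so there is no busy loop tying up the thread that subscribes.
        ConnectableFlux<Long> ticks = Flux.interval(Duration.ofSeconds(2)).publish();

        // Set up the "listeners" before the source starts emitting.
        Disposable s1 = ticks.subscribe(t -> System.out.println("Listener 1 - " + t));
        Disposable s2 = ticks.subscribe(t -> System.out.println("Listener 2 - " + t));

        // connect() just triggers the upstream subscription and returns immediately.
        Disposable connection = ticks.connect();

        TimeUnit.SECONDS.sleep(5);

        // Tear everything down.
        connection.dispose();
        s1.dispose();
        s2.dispose();
    }
}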
Using Spring Cloud Contract 2.1.3.RELEASE with Spring Boot 2.1.1.RELEASE, I have added the dependency and plugin as explained in the guide:
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-contract-verifier</artifactId>
<version>${spring-cloud-contract.version}</version>
<scope>test</scope>
</dependency>
and
<plugin>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-contract-maven-plugin</artifactId>
<version>${spring-cloud-contract.version}</version>
<extensions>true</extensions>
</plugin>
I have also added the following under $rootDir/src/test/resources/contracts:
A Groovy file:
package contracts
import org.springframework.cloud.contract.spec.Contract
Contract.make {
name("contract_updateNodeV4")
request {
method 'PUT'
url '/v4/nodes'
headers {
header 'Content-Type': 'application/vnd.org.springframework.cloud.contract.verifier.twitter-places-analyzer.v1+json'
}
body(file("updateNodeV4_request.json"))
}
response {
status OK()
body(file("updateNodeV4_response.json"))
}
}
And the corresponding updateNodeV4_request.json and updateNodeV4_response.json files, which are valid JSON (contents omitted since they are large).
When running mvn clean install I expected generated tests to be created (and fail for now) per the guide.
Instead I am getting the following error:
[ERROR] Failed to execute goal org.springframework.cloud:spring-cloud-contract-maven-plugin:1.0.0.RELEASE:generateStubs (default-generateStubs) on project xxx: Stubs could not be found: [C:\Users\xxx\git\xxx\target\stubs] .
[ERROR] Please make sure that spring-cloud-contract:convert was invoked
Most likely your contracts are not under the module's src/test/resources/contracts but under the root module's folder. If that's the case, you need to tell the plugin by setting the contracts dir plugin property.
I solved it by moving the plugin:
<plugin>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-contract-maven-plugin</artifactId>
<version>${spring-cloud-contract.version}</version>
<extensions>true</extensions>
</plugin>
from the root pom.xml to the pom.xml of the specific module in which I created the contracts. Now it works as expected.
What is the correct way to configure Swagger2 with Spring MVC (not Spring Boot)?
Currently I have the following components added.
Maven dependencies:
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger2</artifactId>
<version>2.7.0</version>
</dependency>
<!--Dependency for swagger ui -->
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger-ui</artifactId>
<version>2.7.0</version>
</dependency>
In the dispatcher servlet context:
<bean id="swagger2Config" class="springfox.documentation.swagger2.configuration.Swagger2DocumentationConfiguration"/>
<mvc:resources location="/resources/" mapping="/resources/**"
order="1" />
<mvc:resources location="classpath:/META-INF/resources/" mapping="swagger-ui.html" />
<mvc:resources location="classpath:/META-INF/resources/webjars/" mapping="/webjars/**" />
<mvc:default-servlet-handler />
and added a configuration class:
@EnableSwagger2
public class SwaggerConfiguration extends WebMvcConfigurerAdapter {
@Override
public void addResourceHandlers(ResourceHandlerRegistry registry) {
registry.addResourceHandler("/swagger-ui.html")
.addResourceLocations("classpath:/META-INF/resources/");
registry.addResourceHandler("/webjars/**")
.addResourceLocations("classpath:/META-INF/resources/webjars/");
}
}
My application context is, say, sample. When I try to launch the application at
http://localhost:8080/sample/swagger-ui.html,
I get this error:
Unable to infer base url. This is common when using dynamic servlet
registration or when the API is behind an API Gateway. The base url is the
root of where all the swagger resources are served. For e.g. if the api is
available at http://example.org/api/v2/api-docs then the base url is
http://example.org/api/. Please enter the location manually:
If I try
http://localhost:8080/sample/v2/api-docs/
I get this exception:
Handler processing failed; nested exception is java.lang.NoSuchMethodError
on org.springframework.web.util.UriComponentsBuilder.fromHttpRequest
Am I missing anything else here to make it work?
By the way, I'm using Spring 3.2.9 and Servlet API 2.5.
Thank you all.
It seems your @EnableSwagger2 annotation is not getting scanned during startup. Try using component scanning in your dispatcher servlet XML file, something like below:
<context:component-scan base-package="your.package.name" />
Pass the package where the SwaggerConfiguration class is present. That should solve the issue.
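For completeness, springfox is normally also given a Docket bean in a scanned configuration class; below is a minimal sketch with the selectors left wide open (the class name SwaggerDocketConfig is illustrative, and note the follow-up below about Spring 3.x compatibility).

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import springfox.documentation.builders.PathSelectors;
import springfox.documentation.builders.RequestHandlerSelectors;
import springfox.documentation.spi.DocumentationType;
import springfox.documentation.spring.web.plugins.Docket;
import springfox.documentation.swagger2.annotations.EnableSwagger2;

@Configuration
@EnableSwagger2
public class SwaggerDocketConfig {

    @Bean
    public Docket api() {
        // Document every handler and path; narrow the selectors for real projects.
        return new Docket(DocumentationType.SWAGGER_2)
                .select()
                .apis(RequestHandlerSelectors.any())
                .paths(PathSelectors.any())
                .build();
    }
}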
Actually, based on stackoverflow.com/questions/33518672/… and github.com/springfox/springfox/blob/v1.0.2/README.md, Spring 3.x is not compatible with Swagger 2, so I ended up using Swagger 1. I'm following the example given at github.com/martypitt/swagger-springmvc-example. I will update once I successfully integrate Swagger 1.