Spring Webflux and Observable responses not working - spring-mvc

I just created a simple Spring Boot application using spring-boot-starter-webflux version 2.0.0.BUILD-SNAPSHOT, which brings in spring-webflux version 5.0.0.BUILD-SNAPSHOT (and the same for Spring Core, Beans, Context, etc.).
If I create a simple @RestController and provide a @GetMapping that simply returns a Flux<String>, everything works as expected.
However, if I change from Flux to RxJava's Observable, I get this error:
org.springframework.web.HttpMediaTypeNotAcceptableException: Could not find acceptable representation
Debugging a bit through the code, I found that Jackson's ObjectMapper somehow registers Flux, Mono, and the rest of the reactive types in its typeFactory, so the MappingJackson2HttpMessageConverter later knows how to (de)serialize them.
However, that is not the case when I use an Observable: I don't find the type Observable or Single registered in the ObjectMapper's type factory, so I get the aforementioned error.
Has anyone experienced this issue? Am I missing a dependency? Do I need to manually tell Jackson how to (de)serialize RxJava constructs? And why does Jackson already know about Flux and Mono?
Thanks for your help.
EDITED:
I'm using RxJava 1.2.7. Here is my pom.xml:
<dependencies>
    <!-- Spring Boot -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-webflux</artifactId>
        <version>2.0.0.BUILD-SNAPSHOT</version>
    </dependency>
    <!-- RxJava dependencies -->
    <dependency>
        <groupId>io.reactivex</groupId>
        <artifactId>rxjava</artifactId>
        <version>1.2.7</version>
    </dependency>
</dependencies>
And here is the example of my controller code:
/**
 * Get all the available tasks.
 *
 * @return the list of tasks
 */
@GetMapping(path = "/task")
@ResponseStatus(HttpStatus.OK)
public Observable<TaskView> allTasks(@AuthenticationPrincipal LoggedUserVO principal) {
    return this.pocComponent.getAllTasks()
            .map(t -> ViewConverter.convertFromTaskDocument(t, principal));
}

You are probably missing the following dependency to make it work:
<dependency>
    <groupId>io.reactivex</groupId>
    <artifactId>rxjava-reactive-streams</artifactId>
    <version>1.2.1</version>
</dependency>
Changes across RxJava 1.x versions made it challenging for us to support it out of the box, which is why we prefer to rely on the official RxJava -> Reactive Streams adapter. Note that RxJava 2.x is supported without additional dependencies, as it is natively built on top of Reactive Streams.
I am going to update the Spring WebFlux reference documentation to specify that this dependency is required for RxJava 1.x support.
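To illustrate why the adapter dependency helps: it bridges RxJava 1.x types to the Reactive Streams Publisher abstraction that WebFlux works against internally. A minimal sketch, assuming rxjava and rxjava-reactive-streams are on the classpath (the task values are made up):

```java
import org.reactivestreams.Publisher;
import rx.Observable;
import rx.RxReactiveStreams;

public class RxBridgeSketch {
    public static void main(String[] args) {
        Observable<String> tasks = Observable.just("task-1", "task-2");
        // The adapter converts an RxJava 1.x Observable into a Reactive
        // Streams Publisher; with the jar present, WebFlux can apply the
        // same conversion to controller return values.
        Publisher<String> publisher = RxReactiveStreams.toPublisher(tasks);
        System.out.println(publisher != null);
    }
}
```

With the dependency on the classpath, no controller changes are needed; the framework performs this conversion itself.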

So, Flux and Mono are Spring's own types, as Spring has the Not-Invented-Here syndrome.
You can either register a handler that converts to types Spring knows about, use the AsyncResponse facility that Jersey provides, or add toBlocking everywhere.

Related

Axon framework- storing a list of events

I have a list of entities, List<Entity> entitiesList, and I need to publish and store the events for each entity.
I have an aggregate for Entity, all the necessary handlers, a CreateEntityCommand, and an EntityCreatedEvent.
Currently what I do:
1. Create commands in a loop and send them via the command gateway for each entity from entitiesList:
for (Entity entity : entitiesList) {
    CreateEntityCommand createEntityCommand = new CreateEntityCommand();
    // ... here I set the command's fields ...
    commandGateway.send(createEntityCommand);
}
Inside the aggregate I have:
@CommandHandler
public EntityAggregate(CreateEntityCommand createEntityCommand) {
    EntityCreatedEvent entityCreatedEvent = new EntityCreatedEvent();
    // ... here I set the event's fields ...
    AggregateLifecycle.apply(entityCreatedEvent);
}
As a result, the events are created, published, and saved into the DomainEventEntry table inside the loop, one by one.
If I have 10,000 entities, this process takes a lot of time.
My question is: how can I improve this process of creating, publishing, and saving a list of events?
I use this version of Axon:
<dependency>
    <groupId>org.axonframework</groupId>
    <artifactId>axon-spring-boot-starter</artifactId>
    <version>4.3</version>
    <exclusions>
        <exclusion>
            <groupId>org.axonframework</groupId>
            <artifactId>axon-server-connector</artifactId>
        </exclusion>
    </exclusions>
</dependency>
Spring Boot configuration with the annotation @SpringBootApplication.
I haven't configured anything specific around Axon.
What I believe you need is to parallelize the processing of the commands to speed up the work. There are two ways to achieve this:
1. Change the local CommandBus
2. Distribute your application
I am assuming you are at option 1, hence my answer is tailored towards it.
When you have a single instance of an Axon application using Spring Boot, a SimpleCommandBus is auto-configured for you. It does not provide any possibility for concurrent work, so configuring a different CommandBus bean is the way to go.
I'd suggest first trying the AsynchronousCommandBus. This implementation uses an Executor (which you can further configure if you so desire) to spin up threads to dispatch (and handle) each command going through it.
If this is still too slow for your liking, I'd try the DisruptorCommandBus (for specifics on what a "Disruptor" is, you can look here). This CommandBus implementation uses two thread pools: one for handling commands and another for storing the events.
Lastly, if you are already working with a distributed version of the CommandBus (like Axon Server, or the DistributedCommandBus), you will have to provide a CommandBus bean with the qualifier "localSegment" attached to it. For a quick overview of the command buses provided by Axon, have a look at their Reference Guide.
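The concurrency idea behind AsynchronousCommandBus can be illustrated with plain java.util.concurrent primitives. This is a general sketch, not Axon API: handleCommand is a hypothetical stand-in for commandGateway.send, and the class and method names are invented for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ParallelDispatchSketch {

    // Hypothetical stand-in for commandGateway.send(...); in a real Axon
    // application this would dispatch a CreateEntityCommand.
    static Integer handleCommand(int entityId) {
        return entityId;
    }

    // Dispatch all commands through a fixed-size thread pool instead of
    // sequentially in a loop, then wait for every dispatch to complete.
    static List<Integer> dispatchAll(int count, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<Integer>> futures = IntStream.range(0, count)
                    .mapToObj(id -> pool.submit(() -> handleCommand(id)))
                    .collect(Collectors.toList());
            List<Integer> results = new ArrayList<>();
            for (Future<Integer> f : futures) {
                results.add(f.get()); // block until this command is handled
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(dispatchAll(10, 4).size()); // prints 10
    }
}
```

Configuring an AsynchronousCommandBus bean gives you this behavior inside Axon's own dispatching pipeline, without having to manage a pool yourself.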

Google dataflow - Connecting to datastore using datastore options gives error

We are trying to connect to the Datastore service from a Dataflow job written in Java, but we are facing issues due to a Datastore SDK error.
We are running the job with the DirectRunner on a local machine using Eclipse.
Code:
import java.net.SocketTimeoutException;
import org.apache.beam.sdk.options.PipelineOptions;
import com.google.cloud.datastore.Datastore;
import com.google.cloud.datastore.DatastoreOptions;

public class StarterPipeline {
    public interface StarterPipelineOption extends PipelineOptions {
    }

    @SuppressWarnings("serial")
    public static void main(String[] args) throws SocketTimeoutException {
        Datastore datastore = DatastoreOptions.getDefaultInstance().getService();
    }
}
Error:
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Illegal character in path at index 45: https://datastore.googleapis.com/v1/projects/<!DOCTYPE html>
at com.google.datastore.v1.client.DatastoreFactory.validateUrl(DatastoreFactory.java:122)
at com.google.datastore.v1.client.DatastoreFactory.buildProjectEndpoint(DatastoreFactory.java:108)
at com.google.datastore.v1.client.DatastoreFactory.newRemoteRpc(DatastoreFactory.java:115)
at com.google.datastore.v1.client.DatastoreFactory.create(DatastoreFactory.java:65)
at com.google.cloud.datastore.spi.v1.HttpDatastoreRpc.<init>(HttpDatastoreRpc.java:71)
at com.google.cloud.datastore.DatastoreOptions$DefaultDatastoreRpcFactory.create(DatastoreOptions.java:61)
at com.google.cloud.datastore.DatastoreOptions$DefaultDatastoreRpcFactory.create(DatastoreOptions.java:55)
at com.google.cloud.ServiceOptions.getRpc(ServiceOptions.java:512)
at com.google.cloud.datastore.DatastoreOptions.getDatastoreRpcV1(DatastoreOptions.java:179)
at com.google.cloud.datastore.DatastoreImpl.<init>(DatastoreImpl.java:56)
at com.google.cloud.datastore.DatastoreOptions$DefaultDatastoreFactory.create(DatastoreOptions.java:51)
at com.google.cloud.datastore.DatastoreOptions$DefaultDatastoreFactory.create(DatastoreOptions.java:45)
at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:499)
at purplle.datapipeline.StarterPipeline.main(StarterPipeline.java:234)
Caused by: java.net.URISyntaxException: Illegal character in path at index 45: https://datastore.googleapis.com/v1/projects/<!DOCTYPE html>
at java.net.URI$Parser.fail(Unknown Source)
at java.net.URI$Parser.checkChars(Unknown Source)
at java.net.URI$Parser.parseHierarchical(Unknown Source)
at java.net.URI$Parser.parse(Unknown Source)
at java.net.URI.<init>(Unknown Source)
at com.google.datastore.v1.client.DatastoreFactory.validateUrl(DatastoreFactory.java:120)
... 13 more
We are using the versions of the SDKs below, which I believe are up to date.
<!-- https://mvnrepository.com/artifact/com.google.cloud/google-cloud-storage -->
<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-storage</artifactId>
    <version>1.37.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.google.cloud/google-cloud-datastore -->
<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-datastore</artifactId>
    <version>1.37.1</version>
</dependency>
<dependency>
    <groupId>com.google.cloud.dataflow</groupId>
    <artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
    <version>2.5.0</version>
</dependency>
While searching for a solution we found the thread below, which states that this issue was fixed in February, but I am still facing it.
https://github.com/GoogleCloudPlatform/google-cloud-java/issues/2440
We contacted Google Cloud support and they said that to run the pipeline locally while developing, we have to manually provide the project id through an environment variable.
Below is the response from Google support.
The error you are getting is from [1] and is thrown when "the server or credentials weren't provided". In your case, you didn't specify credentials when constructing the client:
Datastore datastore = DatastoreOptions.getDefaultInstance().getService();
The client library tried to look for credentials via the environment variable GOOGLE_APPLICATION_CREDENTIALS, but it failed, as the job is not running on Compute Engine, Kubernetes Engine, App Engine, or Cloud Functions [3], [4]. I believe your code should work with DataflowRunner. Please confirm if that is the case.
To run the code locally, you can create and obtain service account credentials manually [5]. For a more detailed example, please check [4] or [6].
So I downloaded the service account credentials for connecting to the Datastore API, and it worked!
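For local runs, the credentials and project id can also be supplied explicitly instead of relying on the environment. A sketch along the lines of the support answer, where "my-project" and the key-file path are placeholders for your own values:

```java
import java.io.FileInputStream;

import com.google.auth.oauth2.ServiceAccountCredentials;
import com.google.cloud.datastore.Datastore;
import com.google.cloud.datastore.DatastoreOptions;

public class DatastoreAuthSketch {
    public static void main(String[] args) throws Exception {
        // Build the options explicitly rather than via getDefaultInstance(),
        // so no environment lookup is needed on the local machine.
        Datastore datastore = DatastoreOptions.newBuilder()
                .setProjectId("my-project") // placeholder project id
                .setCredentials(ServiceAccountCredentials.fromStream(
                        new FileInputStream("/path/to/service-account-key.json")))
                .build()
                .getService();
    }
}
```

Alternatively, setting GOOGLE_APPLICATION_CREDENTIALS to the key-file path keeps the original getDefaultInstance() call working unchanged.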

AWS X-Ray AmazonDynamoDBv2 segment not found

I have a Spring web application which I want to instrument using AWS X-Ray. I have added "AWSXRayServletFilter" in my web.xml, and the snippet below in my Spring configuration class, as per the documentation.
static {
    AWSXRayRecorderBuilder builder = AWSXRayRecorderBuilder.standard()
            .withPlugin(new EC2Plugin()).withPlugin(new ECSPlugin());
    builder.withSamplingStrategy(new DefaultSamplingStrategy());
    AWSXRay.setGlobalRecorder(builder.build());
}
The dependency below is also added in pom.xml:
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-xray-recorder-sdk-aws-sdk-instrumentor</artifactId>
    <version>1.2.0</version>
</dependency>
During application start-up, I am getting the exception below:
com.amazonaws.xray.exceptions.SegmentNotFoundException: Failed to begin subsegment named 'AmazonDynamoDBv2': segment cannot be found
Any pointers to solve this would be helpful.
When you initialize the global recorder, you should also start the parent segment. You're trying to create a subsegment without a segment:
AWSXRay.setGlobalRecorder(AWSXRayRecorderBuilder.defaultRecorder());
AWSXRay.beginSegment("MySeg");
I only got this in JUnit tests where DynamoDB was mocked. To fix it, I put
AWSXRay.beginSegment("AmazonDynamoDBv2");
in the test setup (@Before).
I did not touch the implementation, since I believe this is something Amazon already does in its SDK.
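A minimal sketch of the pattern from the answers above: open a parent segment before the instrumented AWS SDK call and always close it. The segment name "MySeg" and the try/finally placement are illustrative, not prescribed by the SDK.

```java
import com.amazonaws.xray.AWSXRay;
import com.amazonaws.xray.entities.Segment;

public class XRaySegmentSketch {
    public static void main(String[] args) {
        // Begin the parent segment; instrumented SDK clients can then
        // attach their subsegments (e.g. "AmazonDynamoDBv2") to it.
        Segment segment = AWSXRay.beginSegment("MySeg");
        try {
            // ... call DynamoDB here; the instrumentor creates a subsegment ...
        } finally {
            // Always end the segment, even if the call throws.
            AWSXRay.endSegment();
        }
    }
}
```

In a servlet application, AWSXRayServletFilter normally opens and closes this segment per request, which is why the error typically surfaces in code paths (like tests or start-up beans) that run outside the filter.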

How do I enable matrix variable support without clobbering Spring Boot configuration?

I'd like to both use Spring Boot to take advantage of JacksonAutoConfiguration and enable matrix variables for my controllers, which requires calling RequestMappingHandlerMapping.setRemoveSemicolonContent(false).
When this trivial Gist is run without the WebMvcConfiguration being scanned in, the output is
{"dateTime":1404244199372}
When it is scanned in, the output is
{"dateTime":{"year":2014,"era":1,"dayOfYear":182,"dayOfWeek":2,"dayOfMonth":1,"centuryOfEra":20,"yearOfEra":2014,"yearOfCentury":14,"weekyear":2014,"monthOfYear":7,"weekOfWeekyear":27,"secondOfDay":76856,"minuteOfDay":1280,"hourOfDay":21,"minuteOfHour":20,"secondOfMinute":56,"millisOfSecond":807,"millisOfDay":76856807,"chronology":{"zone":{"fixed":false,"uncachedZone":{"cachable":true,"fixed":false,"id":"Europe/Berlin"},"id":"Europe/Berlin"}},"zone":{"fixed":false,"uncachedZone":{"cachable":true,"fixed":false,"id":"Europe/Berlin"},"id":"Europe/Berlin"},"millis":1404242456807,"afterNow":false,"beforeNow":true,"equalNow":false}}
It's quite hard to tell why this is happening, and I'm still not sure after digging around in ObjectMappers, JodaModule, and MappingJackson2HttpMessageConverter.
Any idea how to configure Spring to leverage both Spring Boot and be able to support matrix variables?
Update: Other breakages caused by scanning in DelegatingWebMvcConfiguration include Boot's http.mappers.jsonPrettyPrint support.
As described here, the following does the trick with Spring Boot 1.2:
@Configuration
public class WebMvcConfig extends WebMvcConfigurerAdapter {
    @Override
    public void configurePathMatch(PathMatchConfigurer configurer) {
        UrlPathHelper urlPathHelper = new UrlPathHelper();
        urlPathHelper.setRemoveSemicolonContent(false);
        configurer.setUrlPathHelper(urlPathHelper);
    }
}
I think you might just have to copy the bits you need from the Boot auto-configuration. Or raise an issue with Spring Framework to get support for the semicolon feature added to WebMvcConfigurer. One other thing that might just work is to add a RequestMappingHandlerMapping bean directly (rather than using a base class for your config).

LCDS / Spring integration for Assemblers

I have a Flex / Spring / LCDS project, and I'm trying to use the Spring/Flex integration module.
It works fine for exposing simple destinations and messaging endpoints; however, I'm unsure how to configure it to use Assemblers.
The vanilla, no-Spring-integration way involves declaring a destination such as:
<destination id="book.service">
    <properties>
        <source>flex.data.assemblers.HibernateAnnotationsAssembler</source>
        <item-class>com.library.Book</item-class>
    </properties>
</destination>
However, when I try to integrate this approach with Spring, I come unstuck.
This destination needs an adapter. Running as-is, allowing the Spring/Flex integration to install the default remoting adapter, doesn't work; I get the following error at runtime:
Caused by: flex.messaging.config.ConfigurationException: Destination 'book.service' must specify at least one adapter.
How do I connect this destination to the adapter?
Also, will the HibernateAnnotationsAssembler detect and integrate with the Spring-managed Hibernate sessions, or does this require additional configuration as well?
From what I know, Spring is fully integrated only with BlazeDS; you cannot expose Spring beans (assemblers) as destinations.
