Problems with Java Moneta Money object when used in an Axon command

I have an Axon command that contains a Moneta Money object.
import lombok.Getter;
import lombok.ToString;
import lombok.experimental.SuperBuilder;
import org.javamoney.moneta.Money;
import java.time.LocalDate;
import java.util.UUID;
@Getter
@SuperBuilder
@ToString
public class MyAxonCommand {
private final UUID id;
private final Money hoogte;
private final LocalDate opleggingsdatum;
}
When I send this command with Axon, an exception is thrown.
commandGateway.sendAndWait(MyAxonCommand.builder()
.id(new UUID(1, 1))
.hoogte(Money.of(0, "EUR"))
.opleggingsdatum(LocalDate.now())
.build());
The exception thrown is:
18:07:37.456 [main] INFO org.javamoney.moneta.DefaultMonetaryContextFactory - Using custom MathContext: precision=256, roundingMode=HALF_EVEN
18:07:37.465 [main] INFO nl.ind.handhaving.adapter.messaging.incoming.IndigoListener kvk:987654321 zn:Z1-31190106952 - INDiGO bericht ontvangen op methode: receiveMaatregelOpgelegd
18:07:37.928 [docker-java-stream--1691755530] INFO docker.axonserver - STDOUT: 2023-01-18 17:07:37.925 WARN 1 --- [nio-8024-exec-3] A.i.a.a.rest.DevelopmentRestController : [<anonymous>] Request to delete all events in context "default".
18:07:37.941 [EventProcessor[nl.ind.handhaving.application.query]-0] WARN org.axonframework.eventhandling.TrackingEventProcessor - Error occurred. Starting retry mode.
org.axonframework.axonserver.connector.AxonServerException: The Event Stream has been closed, so no further events can be retrieved
at org.axonframework.axonserver.connector.event.axon.EventBuffer.peekNullable(EventBuffer.java:178)
at org.axonframework.axonserver.connector.event.axon.EventBuffer.hasNextAvailable(EventBuffer.java:144)
at org.axonframework.eventhandling.TrackingEventProcessor.processBatch(TrackingEventProcessor.java:401)
at org.axonframework.eventhandling.TrackingEventProcessor.processingLoop(TrackingEventProcessor.java:300)
at org.axonframework.eventhandling.TrackingEventProcessor$TrackingSegmentWorker.run(TrackingEventProcessor.java:1072)
at org.axonframework.eventhandling.TrackingEventProcessor$WorkerLauncher.cleanUp(TrackingEventProcessor.java:1263)
at org.axonframework.eventhandling.TrackingEventProcessor$WorkerLauncher.run(TrackingEventProcessor.java:1240)
at java.base/java.lang.Thread.run(Thread.java:833)
18:07:37.942 [EventProcessor[nl.ind.handhaving.application.query]-0] WARN org.axonframework.eventhandling.TrackingEventProcessor - Releasing claim on token and preparing for retry in 1s
18:07:37.945 [EventProcessor[nl.ind.handhaving.application]-0] WARN org.axonframework.eventhandling.TrackingEventProcessor - Error occurred. Starting retry mode.
org.axonframework.axonserver.connector.AxonServerException: The Event Stream has been closed, so no further events can be retrieved
at org.axonframework.axonserver.connector.event.axon.EventBuffer.peekNullable(EventBuffer.java:178)
at org.axonframework.axonserver.connector.event.axon.EventBuffer.hasNextAvailable(EventBuffer.java:144)
at org.axonframework.eventhandling.TrackingEventProcessor.processBatch(TrackingEventProcessor.java:401)
at org.axonframework.eventhandling.TrackingEventProcessor.processingLoop(TrackingEventProcessor.java:300)
at org.axonframework.eventhandling.TrackingEventProcessor$TrackingSegmentWorker.run(TrackingEventProcessor.java:1072)
at org.axonframework.eventhandling.TrackingEventProcessor$WorkerLauncher.cleanUp(TrackingEventProcessor.java:1263)
at org.axonframework.eventhandling.TrackingEventProcessor$WorkerLauncher.run(TrackingEventProcessor.java:1240)
at java.base/java.lang.Thread.run(Thread.java:833)
18:07:37.945 [EventProcessor[nl.ind.handhaving.application]-0] WARN org.axonframework.eventhandling.TrackingEventProcessor - Releasing claim on token and preparing for retry in 1s
18:07:37.947 [EventProcessor[nl.ind.handhaving.application]-0] INFO org.axonframework.eventhandling.TrackingEventProcessor - Released claim
18:07:37.949 [EventProcessor[nl.ind.handhaving.application.query]-0] INFO org.axonframework.eventhandling.TrackingEventProcessor - Released claim
org.axonframework.commandhandling.CommandExecutionException: org.javamoney.moneta.spi.JDKCurrencyAdapter
at org.axonframework.axonserver.connector.ErrorCode.lambda$static$11(ErrorCode.java:88)
at org.axonframework.axonserver.connector.ErrorCode.convert(ErrorCode.java:182)
at org.axonframework.axonserver.connector.command.CommandSerializer.deserialize(CommandSerializer.java:164)
at org.axonframework.axonserver.connector.command.AxonServerCommandBus.lambda$doDispatch$1(AxonServerCommandBus.java:161)
at java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:646)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2147)
at io.axoniq.axonserver.connector.command.impl.CommandChannelImpl$CommandResponseHandler.onNext(CommandChannelImpl.java:372)
at io.axoniq.axonserver.connector.command.impl.CommandChannelImpl$CommandResponseHandler.onNext(CommandChannelImpl.java:359)
at io.grpc.stub.ClientCalls$StreamObserverToCallListenerAdapter.onMessage(ClientCalls.java:466)
at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1MessagesAvailable.runInternal(ClientCallImpl.java:661)
at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1MessagesAvailable.runInContext(ClientCallImpl.java:646)
at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:133)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: AxonServerRemoteCommandHandlingException{message=An exception was thrown by the remote message handling component: org.javamoney.moneta.spi.JDKCurrencyAdapter, errorCode='AXONIQ-4002', server='134589#xxxxxxxxx}
at org.axonframework.axonserver.connector.ErrorCode.lambda$static$11(ErrorCode.java:86)
... 16 more}
The Axon Server logging (running in Docker):
[Axon Server ASCII banner - Standard Edition, Powered by AxonIQ]
version: 4.5.16
2023-01-18T17:30:30.326181167Z 2023-01-18 17:30:30.321 INFO 1 --- [ main] io.axoniq.axonserver.AxonServer : Starting AxonServer using Java 11.0.14 on c32eb57825c4 with PID 1 (/app/classes started by root in /)
2023-01-18T17:30:30.331544104Z 2023-01-18 17:30:30.325 INFO 1 --- [ main] io.axoniq.axonserver.AxonServer : No active profile set, falling back to 1 default profile: "default"
2023-01-18T17:30:33.989108312Z 2023-01-18 17:30:33.988 INFO 1 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8024 (http)
2023-01-18T17:30:34.235755158Z 2023-01-18 17:30:34.231 INFO 1 --- [ main] A.i.a.a.c.MessagingPlatformConfiguration : Configuration initialized with SSL DISABLED and access control DISABLED.
2023-01-18T17:30:37.126812182Z 2023-01-18 17:30:37.125 INFO 1 --- [ main] io.axoniq.axonserver.AxonServer : Axon Server version 4.5.16
2023-01-18T17:30:39.285810090Z 2023-01-18 17:30:39.285 WARN 1 --- [ main] .s.s.UserDetailsServiceAutoConfiguration :
2023-01-18T17:30:39.285860293Z
2023-01-18T17:30:39.285865737Z Using generated security password: f23552a4-9623-4adb-831e-506eac6a10a9
2023-01-18T17:30:39.285868706Z
2023-01-18T17:30:39.285871675Z This generated password is for development use only. Your security configuration must be updated before running your application in production.
2023-01-18T17:30:39.285874618Z
2023-01-18T17:30:41.633817404Z 2023-01-18 17:30:41.633 INFO 1 --- [ main] io.axoniq.axonserver.grpc.Gateway : Axon Server Gateway started on port: 8124 - no SSL
2023-01-18T17:30:41.667366113Z 2023-01-18 17:30:41.667 INFO 1 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8024 (http) with context path ''
2023-01-18T17:30:42.561266508Z 2023-01-18 17:30:42.555 INFO 1 --- [ main] io.axoniq.axonserver.AxonServer : Started AxonServer in 12.861 seconds (JVM running for 13.412)
2023-01-18T17:30:57.342935513Z 2023-01-18 17:30:57.338 INFO 1 --- [grpc-executor-1] i.a.a.logging.TopologyEventsLogger : Application connected: handhaving-service, clientId = 149931#v2l1-xxxxl, clientStreamId = 149931#v2l1-xxxxx.87e0f589-66d8-41ee-ab4a-7bc599cc2c01, context = default
2023-01-18T17:31:02.213813565Z 2023-01-18 17:31:02.213 WARN 1 --- [nio-8024-exec-3] A.i.a.a.rest.DevelopmentRestController : [<anonymous>] Request to delete all events in context "default".
2023-01-18T17:31:04.567541554Z 2023-01-18 17:31:04.567 INFO 1 --- [grpc-executor-3] i.a.a.logging.TopologyEventsLogger : Application disconnected: handhaving-service, clientId = 149931#xxxxx.87e0f589-66d8-41ee-ab4a-7bc599cc2c01, context = default: Platform connection completed by client
The issue seems to be that Axon is storing this Money object in a database, according to errorCode='AXONIQ-4002'.
What can I do to fix this? Does Axon need a Hibernate UserType so that it is able to store this Money object, or some other kind of type converter?
It seems that the deserializer in Axon Server has problems with the Money object.
In order to store this Money object in a view database - where I store the event generated by the command - I had to write a type conversion for Hibernate (see the sketch below); this seems to be related to the exception that occurred.
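Purely for illustration, a minimal JPA AttributeConverter along the lines mentioned above might look like this. It is a hedged sketch, not the project's actual converter, and the single-column string format (Money.toString() / Money.parse()) is an assumption:
import org.javamoney.moneta.Money;
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;

@Converter(autoApply = true)
public class MoneyAttributeConverter implements AttributeConverter<Money, String> {

    @Override
    public String convertToDatabaseColumn(Money money) {
        // Money.toString() yields e.g. "EUR 0", which Money.parse(...) can read back
        return money == null ? null : money.toString();
    }

    @Override
    public Money convertToEntityAttribute(String dbValue) {
        return dbValue == null ? null : Money.parse(dbValue);
    }
}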
The project uses:
Spring Boot 2.7.6
axon-spring-boot-starter 4.5.15
moneta 1.4.2
It all runs with Java Temurin 17.0.4.
For Axon we have no configuration for serialization, so the default is used: XML.
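Purely as an illustration (a hedged sketch, not a confirmed fix): one direction could be to configure the Axon Spring Boot starter to use its Jackson serializer instead of the XStream default, and to register a Jackson module that understands javax.money types, for example Zalando's jackson-datatype-money. The axon.serializer.* property names, the MoneyModule class, and the assumption that Axon picks up Spring's auto-configured ObjectMapper all need to be verified against the respective documentation.
# application.properties (assumed Axon Spring Boot property names)
axon.serializer.general=jackson
axon.serializer.events=jackson
axon.serializer.messages=jackson
import com.fasterxml.jackson.databind.Module;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.zalando.jackson.datatype.money.MoneyModule;

@Configuration
public class MoneySerializationConfig {

    // Registered on Spring Boot's auto-configured ObjectMapper; assumed to be the
    // ObjectMapper that Axon's JacksonSerializer ends up using.
    @Bean
    public Module moneyModule() {
        return new MoneyModule();
    }
}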

Related

Localhost:8080 cannot connect (ShinyProxy on Windows)

Following the getting started guide at shinyproxy.io, I installed ShinyProxy on Windows 11:
docker: pulled the image: "docker pull openanalytics/shinyproxy-demo"
installed Java v.15
downloaded shinyproxy-2.6.1.jar
inserted hosts in daemon.json (located in .docker under users)
I started ShinyProxy: java -jar shinyproxy-2.6.1.jar
I opened localhost:8080 in Chrome and got the message: Cannot connect
Logs:
PS C:\7.20 Shiiny proxy java> java -jar shinyproxy-2.6.1.jar
[Spring Boot ASCII banner]
:: Spring Boot :: (v2.5.12)
2022-05-24 07:17:14.262 INFO 12580 — [ main] .s.d.r.c.RepositoryConfigurationDelegate : Multiple Spring Data modules found, entering strict repository configuration mode!
2022-05-24 07:17:14.272 INFO 12580 — [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data Redis repositories in DEFAULT mode.
2022-05-24 07:17:14.352 INFO 12580 — [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 47 ms. Found 0 Redis repository interfaces.
2022-05-24 07:17:15.234 INFO 12580 — [ main] e.o.c.service.IdentifierService : ShinyProxy runtimeId: 13976d5c-4ad0-426e-ada0-d9e457cf10ef
2022-05-24 07:17:15.244 INFO 12580 — [ main] e.o.c.service.IdentifierService : ShinyProxy instanceID (hash of config): unknown-instance-id
2022-05-24 07:17:16.021 WARN 12580 — [ main] io.undertow.websockets.jsr : UT026010: Buffer pool was not set on WebSocketDeploymentInfo, the default pool will be used
2022-05-24 07:17:16.079 INFO 12580 — [ main] io.undertow.servlet : Initializing Spring embedded WebApplicationContext
2022-05-24 07:17:16.080 INFO 12580 — [ main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 3340 ms
2022-05-24 07:17:16.277 INFO 12580 — [ main] o.s.boot.web.servlet.RegistrationBean : Filter orderedFormContentFilter was not registered (disabled)
2022-05-24 07:17:16.647 INFO 12580 — [ main] s.s.l.DefaultSpringSecurityContextSource : URL ‘ldap://ldap.forumsys.com:389/dc=example,dc=com’, root DN is ‘dc=example,dc=com’
2022-05-24 07:17:16.670 INFO 12580 — [ main] .s.s.l.u.DefaultLdapAuthoritiesPopulator : groupSearchBase is empty. Searches will be performed from the context source base
2022-05-24 07:17:16.794 INFO 12580 — [ main] o.s.s.web.DefaultSecurityFilterChain : Will not secure any request
2022-05-24 07:17:16.818 INFO 12580 — [ main] e.o.c.stat.StatCollectorFactory : Disabled. Usage statistics will not be processed.
2022-05-24 07:17:16.836 WARN 12580 — [ main] org.thymeleaf.templatemode.TemplateMode : [THYMELEAF][main] Template Mode ‘HTML5’ is deprecated. Using Template Mode ‘HTML’ instead.
2022-05-24 07:17:17.447 INFO 12580 — [ main] o.s.b.a.w.s.WelcomePageHandlerMapping : Adding welcome page template: index
2022-05-24 07:17:17.834 INFO 12580 — [ main] o.s.l.c.support.AbstractContextSource : Property ‘userDn’ not set - anonymous context will be used for read-write operations
2022-05-24 07:17:18.090 INFO 12580 — [ main] io.undertow : starting server: Undertow - 2.2.8.Final
2022-05-24 07:17:18.112 INFO 12580 — [ main] org.xnio : XNIO version 3.8.4.Final
2022-05-24 07:17:18.135 INFO 12580 — [ main] org.xnio.nio : XNIO NIO Implementation Version 3.8.4.Final
2022-05-24 07:17:18.282 INFO 12580 — [ main] org.jboss.threads : JBoss Threads version 3.1.0.Final
2022-05-24 07:17:18.376 INFO 12580 — [ main] o.s.b.w.e.undertow.UndertowWebServer : Undertow started on port(s) 8080 (http)
2022-05-24 07:17:18.501 INFO 12580 — [ main] io.undertow.servlet : Initializing Spring embedded WebApplicationContext
2022-05-24 07:17:18.502 INFO 12580 — [ main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 119 ms
2022-05-24 07:17:18.526 INFO 12580 — [ main] o.s.b.a.e.web.EndpointLinksResolver : Exposing 1 endpoint(s) beneath base path ‘/actuator’
2022-05-24 07:17:18.619 INFO 12580 — [ main] io.undertow : starting server: Undertow - 2.2.8.Final
2022-05-24 07:17:18.641 INFO 12580 — [ main] o.s.b.w.e.undertow.UndertowWebServer : Undertow started on port(s) 9090 (http)
2022-05-24 07:17:18.679 INFO 12580 — [ main] e.o.c.service.AppRecoveryService : Recovery of running apps disabled
2022-05-24 07:17:18.680 INFO 12580 — [ main] e.o.c.util.StartupEventListener : Started ShinyProxy 2.6.1 (ContainerProxy 0.8.11)
2022-05-24 07:17:45.587 INFO 12580 — [ XNIO-1 task-1] io.undertow.servlet : Initializing Spring DispatcherServlet ‘dispatcherServlet’
2022-05-24 07:17:45.588 INFO 12580 — [ XNIO-1 task-1] o.s.web.servlet.DispatcherServlet : Initializing Servlet ‘dispatcherServlet’
2022-05-24 07:17:45.591 INFO 12580 — [ XNIO-1 task-1] o.s.web.servlet.DispatcherServlet : Completed initialization in 3 ms
2022-05-24 07:17:45.662 ERROR 12580 — [ XNIO-1 task-3] io.undertow.request : UT005023: Exception handling request to /
java.lang.NoClassDefFoundError: Could not initialize class org.xnio.channels.Channels
at io.undertow.servlet.spec.ServletOutputStreamImpl.close(ServletOutputStreamImpl.java:619) ~[undertow-servlet-2.2.8.Final.jar!/:2.2.8.Final]*
at io.undertow.servlet.spec.HttpServletResponseImpl.closeStreamAndWriter(HttpServletResponseImpl.java:497) ~[undertow-servlet-2.2.8.Final.jar!/:2.2.8.Final]*
at io.undertow.servlet.spec.HttpServletResponseImpl.responseDone(HttpServletResponseImpl.java:586) ~[undertow-servlet-2.2.8.Final.jar!/:2.2.8.Final]*
at io.undertow.servlet.spec.HttpServletResponseImpl.sendRedirect(HttpServletResponseImpl.java:211) ~[undertow-servlet-2.2.8.Final.jar!/:2.2.8.Final]*
at javax.servlet.http.HttpServletResponseWrapper.sendRedirect(HttpServletResponseWrapper.java:130) ~[jakarta.servlet-api-4.0.4.jar!/:4.0.4]*
at org.springframework.security.web.firewall.FirewalledResponse.sendRedirect(FirewalledResponse.java:48) ~[spring-security-web-5.5.5.jar!/:5.5.5]*
at javax.servlet.http.HttpServletResponseWrapper.sendRedirect(HttpServletResponseWrapper.java:130) ~[jakarta.servlet-api-4.0.4.jar!/:4.0.4]*
at org.springframework.security.web.util.OnCommittedResponseWrapper.sendRedirect(OnCommittedResponseWrapper.java:136) ~[spring-security-web-5.5.5.jar!/:5.5.5]*
at javax.servlet.http.HttpServletResponseWrapper.sendRedirect(HttpServletResponseWrapper.java:130) ~[jakarta.servlet-api-4.0.4.jar!/:4.0.4]*
at org.springframework.security.web.util.OnCommittedResponseWrapper.sendRedirect(OnCommittedResponseWrapper.java:136) ~[spring-security-web-5.5.5.jar!/:5.5.5]*
at org.springframework.se*
I SEE THE ERROR
java.lang.NoClassDefFoundError: Could not initialize class org.xnio.channels.Channels
Please help. What am I missing?
If you still have this problem try adding this:
-Djdk.io.File.enableADS=true
I bet the problem you've faced is similar to https://bugs.openjdk.org/browse/JDK-8285445
So you can either use jdk.io.File.enableADS=true, or downgrade your JDK, or download a version where the fix is applied :)
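For reference, the flag can be passed on the same command line used to start ShinyProxy in the question (just an example invocation):
java -Djdk.io.File.enableADS=true -jar shinyproxy-2.6.1.jar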

Spring Cloud Stream Kafka - "Failed to create topics" error for consumer

I am running JDK 11, Spring Boot 2.5.5, Spring Cloud 2020.0.5 and Kafka 2.7.2.
When I have this consumer code with a Message Type of my business object:
@Bean
public Consumer<Message<MyObjectType>> process() {
return input -> {
logger.info("Message = {}", input.getPayload());
};
}
I get the error "Failed to create topics" (see more detailed logs below), but if I change my consumer code to the String type:
public Consumer<String> consumer() {
return s -> {
logger.info("Message = {}", s);
};
}
I can see the JSON string of my business object. Why does SCS attempt to create topics in the former case and fail? Below are my properties and error logs. Any help would be appreciated.
Thanks
Properties:
spring.cloud.stream.kafka.binder.brokers=mykafka:52094
spring.cloud.stream.kafka.binder.configuration.security.protocol=SASL_PLAINTEXT
spring.cloud.stream.bindings.consumer-in-0.destination=my.topic
spring.cloud.stream.bindings.consumer-in-0.group=
spring.cloud.stream.bindings.consumer-in-0.content-type=application/json
Logs:
2022-01-17 13:14:11.614 ERROR 11256 --- [ main] o.s.c.s.b.k.p.KafkaTopicProvisioner : Failed to create topics
org.apache.kafka.common.errors.TopicAuthorizationException: Authorization failed.
2022-01-17 13:14:11.615 INFO 11256 --- [| adminclient-1] o.a.kafka.common.utils.AppInfoParser : App info kafka.admin.client for adminclient-1 unregistered
2022-01-17 13:14:11.622 WARN 11256 --- [.EXAMPLE.COM] o.a.k.c.security.kerberos.KerberosLogin : [Principal=myapp@DEV.EXAMPLE.COM]: TGT renewal thread has been interrupted and will exit.
2022-01-17 13:14:11.622 INFO 11256 --- [| adminclient-1] org.apache.kafka.common.metrics.Metrics : Metrics scheduler closed
2022-01-17 13:14:11.623 INFO 11256 --- [| adminclient-1] org.apache.kafka.common.metrics.Metrics : Closing reporter org.apache.kafka.common.metrics.JmxReporter
2022-01-17 13:14:11.623 INFO 11256 --- [| adminclient-1] org.apache.kafka.common.metrics.Metrics : Metrics reporters closed
2022-01-17 13:14:11.624 ERROR 11256 --- [ main] o.s.cloud.stream.binding.BindingService : Failed to create consumer binding; retrying in 30 seconds
org.springframework.cloud.stream.provisioning.ProvisioningException: Provisioning exception encountered for process-in-0; nested exception is org.apache.kafka.common.errors.TopicAuthorizationException: Authorization failed.
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.createTopic(KafkaTopicProvisioner.java:347) ~[spring-cloud-stream-binder-kafka-core-3.1.6.jar:3.1.6]
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.doProvisionConsumerDestination(KafkaTopicProvisioner.java:231) ~[spring-cloud-stream-binder-kafka-core-3.1.6.jar:3.1.6]
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.provisionConsumerDestination(KafkaTopicProvisioner.java:197) ~[spring-cloud-stream-binder-kafka-core-3.1.6.jar:3.1.6]
at org.springframework.cloud.stream.binder.kafka.provisioning.KafkaTopicProvisioner.provisionConsumerDestination(KafkaTopicProvisioner.java:86) ~[spring-cloud-stream-binder-kafka-core-3.1.6.jar:3.1.6]
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindConsumer(AbstractMessageChannelBinder.java:421) ~[spring-cloud-stream-3.1.6.jar:3.1.6]
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindConsumer(AbstractMessageChannelBinder.java:92) ~[spring-cloud-stream-3.1.6.jar:3.1.6]
at org.springframework.cloud.stream.binder.AbstractBinder.bindConsumer(AbstractBinder.java:143) ~[spring-cloud-stream-3.1.6.jar:3.1.6]
at org.springframework.cloud.stream.binding.BindingService.doBindConsumer(BindingService.java:180) ~[spring-cloud-stream-3.1.6.jar:3.1.6]
at org.springframework.cloud.stream.binding.BindingService.bindConsumer(BindingService.java:137) ~[spring-cloud-stream-3.1.6.jar:3.1.6]
at org.springframework.cloud.stream.binding.AbstractBindableProxyFactory.createAndBindInputs(AbstractBindableProxyFactory.java:118) ~[spring-cloud-stream-3.1.6.jar:3.1.6]
at org.springframework.cloud.stream.binding.InputBindingLifecycle.doStartWithBindable(InputBindingLifecycle.java:58) ~[spring-cloud-stream-3.1.6.jar:3.1.6]
at java.base/java.util.LinkedHashMap$LinkedValues.forEach(LinkedHashMap.java:608) ~[na:na]
at org.springframework.cloud.stream.binding.AbstractBindingLifecycle.start(AbstractBindingLifecycle.java:57) ~[spring-cloud-stream-3.1.6.jar:3.1.6]
at org.springframework.cloud.stream.binding.InputBindingLifecycle.start(InputBindingLifecycle.java:34) ~[spring-cloud-stream-3.1.6.jar:3.1.6]
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:178) ~[spring-context-5.3.14.jar:5.3.14]
at org.springframework.context.support.DefaultLifecycleProcessor.access$200(DefaultLifecycleProcessor.java:54) ~[spring-context-5.3.14.jar:5.3.14]
at org.springframework.context.support.DefaultLifecycleProcessor$LifecycleGroup.start(DefaultLifecycleProcessor.java:356) ~[spring-context-5.3.14.jar:5.3.14]
at java.base/java.lang.Iterable.forEach(Iterable.java:75) ~[na:na]
at org.springframework.context.support.DefaultLifecycleProcessor.startBeans(DefaultLifecycleProcessor.java:155) ~[spring-context-5.3.14.jar:5.3.14]
at org.springframework.context.support.DefaultLifecycleProcessor.onRefresh(DefaultLifecycleProcessor.java:123) ~[spring-context-5.3.14.jar:5.3.14]
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:935) ~[spring-context-5.3.14.jar:5.3.14]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:586) ~[spring-context-5.3.14.jar:5.3.14]
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:145) ~[spring-boot-2.5.8.jar:2.5.8]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:765) ~[spring-boot-2.5.8.jar:2.5.8]
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:445) ~[spring-boot-2.5.8.jar:2.5.8]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:338) ~[spring-boot-2.5.8.jar:2.5.8]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1354) ~[spring-boot-2.5.8.jar:2.5.8]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1343) ~[spring-boot-2.5.8.jar:2.5.8]
at com.example.EventsConsumerApplication.main(EventsConsumerApplication.java:10) ~[classes/:na]
Caused by: org.apache.kafka.common.errors.TopicAuthorizationException: Authorization failed.
This issue was due to carelessness on my part. The method signature should be:
public Consumer<Message<MyObjectType>> consumer()
The function name has to match the consumer-in-0 binding configured in the properties; with the bean named process, the binder instead derived the process-in-0 destination seen in the error log.
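In other words, a consumer bean along these lines (a minimal sketch reusing the question's placeholder MyObjectType and logger) lines up with the spring.cloud.stream.bindings.consumer-in-0.* properties, so no unintended topic gets provisioned:
@Bean
public Consumer<Message<MyObjectType>> consumer() {
    return input -> {
        // The binding name is derived from the function name: consumer-in-0
        logger.info("Message = {}", input.getPayload());
    };
}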

EmbeddedKafka failing since Spring Boot 2.6.X : AccessDeniedException: ..\AppData\Local\Temp\spring.kafka*

Note: this has been fixed through Spring Boot 2.6.5 (see https://github.com/spring-projects/spring-boot/issues/30243)
Since upgrading to Spring Boot 2.6.X (in my case: 2.6.1), I have multiple projects whose unit tests now fail on Windows because EmbeddedKafka cannot start; the same tests run fine on Linux.
There are multiple errors, but this is the first one thrown:
...
[Spring Boot ASCII banner]
:: Spring Boot :: (v2.6.1)
2021-12-09 16:15:00.300 INFO 13864 --- [ main] k.utils.Log4jControllerRegistration$ : Registered kafka:type=kafka.Log4jController MBean
2021-12-09 16:15:00.420 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer :
2021-12-09 16:15:00.420 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : ______ _
2021-12-09 16:15:00.420 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : |___ / | |
2021-12-09 16:15:00.420 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : / / ___ ___ | | __ ___ ___ _ __ ___ _ __
2021-12-09 16:15:00.420 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : / / / _ \ / _ \ | |/ / / _ \ / _ \ | '_ \ / _ \ | '__|
2021-12-09 16:15:00.420 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : / /__ | (_) | | (_) | | < | __/ | __/ | |_) | | __/ | |
2021-12-09 16:15:00.420 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : /_____| \___/ \___/ |_|\_\ \___| \___| | .__/ \___| |_|
2021-12-09 16:15:00.420 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : | |
2021-12-09 16:15:00.420 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : |_|
2021-12-09 16:15:00.420 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer :
2021-12-09 16:15:00.422 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : Server environment:zookeeper.version=3.6.3--6401e4ad2087061bc6b9f80dec2d69f2e3c8660a, built on 04/08/2021 16:35 GMT
2021-12-09 16:15:00.422 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : Server environment:host.name=host.docker.internal
2021-12-09 16:15:00.422 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : Server environment:java.version=11.0.11
2021-12-09 16:15:00.422 INFO 13864 --- [ main] o.a.zookeeper.server.ZooKeeperServer : Server environment:java.vendor=AdoptOpenJDK
...
2021-12-09 16:15:01.015 INFO 13864 --- [nelReaper-Fetch] lientQuotaManager$ThrottledChannelReaper : [ThrottledChannelReaper-Fetch]: Starting
2021-12-09 16:15:01.015 INFO 13864 --- [lReaper-Produce] lientQuotaManager$ThrottledChannelReaper : [ThrottledChannelReaper-Produce]: Starting
2021-12-09 16:15:01.016 INFO 13864 --- [lReaper-Request] lientQuotaManager$ThrottledChannelReaper : [ThrottledChannelReaper-Request]: Starting
2021-12-09 16:15:01.017 INFO 13864 --- [trollerMutation] lientQuotaManager$ThrottledChannelReaper : [ThrottledChannelReaper-ControllerMutation]: Starting
2021-12-09 16:15:01.037 INFO 13864 --- [ main] kafka.log.LogManager : Loading logs from log dirs ArraySeq(C:\Users\ddrop\AppData\Local\Temp\spring.kafka.bf8e2b62-a1f2-4092-b292-a15e35bd31ad18378079390566696446)
2021-12-09 16:15:01.040 INFO 13864 --- [ main] kafka.log.LogManager : Attempting recovery for all logs in C:\Users\ddrop\AppData\Local\Temp\spring.kafka.bf8e2b62-a1f2-4092-b292-a15e35bd31ad18378079390566696446 since no clean shutdown file was found
2021-12-09 16:15:01.043 INFO 13864 --- [ main] kafka.log.LogManager : Loaded 0 logs in 6ms.
2021-12-09 16:15:01.043 INFO 13864 --- [ main] kafka.log.LogManager : Starting log cleanup with a period of 300000 ms.
2021-12-09 16:15:01.045 INFO 13864 --- [ main] kafka.log.LogManager : Starting log flusher with a default period of 9223372036854775807 ms.
2021-12-09 16:15:01.052 INFO 13864 --- [ main] kafka.log.LogCleaner : Starting the log cleaner
2021-12-09 16:15:01.059 INFO 13864 --- [leaner-thread-0] kafka.log.LogCleaner : [kafka-log-cleaner-thread-0]: Starting
2021-12-09 16:15:01.224 INFO 13864 --- [name=forwarding] k.s.BrokerToControllerRequestThread : [BrokerToControllerChannelManager broker=0 name=forwarding]: Starting
2021-12-09 16:15:01.325 INFO 13864 --- [ main] kafka.network.ConnectionQuotas : Updated connection-accept-rate max connection creation rate to 2147483647
2021-12-09 16:15:01.327 INFO 13864 --- [ main] kafka.network.Acceptor : Awaiting socket connections on localhost:63919.
2021-12-09 16:15:01.345 INFO 13864 --- [ main] kafka.network.SocketServer : [SocketServer listenerType=ZK_BROKER, nodeId=0] Created data-plane acceptor and processors for endpoint : ListenerName(PLAINTEXT)
2021-12-09 16:15:01.350 INFO 13864 --- [0 name=alterIsr] k.s.BrokerToControllerRequestThread : [BrokerToControllerChannelManager broker=0 name=alterIsr]: Starting
2021-12-09 16:15:01.364 INFO 13864 --- [eaper-0-Produce] perationPurgatory$ExpiredOperationReaper : [ExpirationReaper-0-Produce]: Starting
2021-12-09 16:15:01.364 INFO 13864 --- [nReaper-0-Fetch] perationPurgatory$ExpiredOperationReaper : [ExpirationReaper-0-Fetch]: Starting
2021-12-09 16:15:01.365 INFO 13864 --- [0-DeleteRecords] perationPurgatory$ExpiredOperationReaper : [ExpirationReaper-0-DeleteRecords]: Starting
2021-12-09 16:15:01.365 INFO 13864 --- [r-0-ElectLeader] perationPurgatory$ExpiredOperationReaper : [ExpirationReaper-0-ElectLeader]: Starting
2021-12-09 16:15:01.374 INFO 13864 --- [rFailureHandler] k.s.ReplicaManager$LogDirFailureHandler : [LogDirFailureHandler]: Starting
2021-12-09 16:15:01.390 INFO 13864 --- [ main] kafka.zk.KafkaZkClient : Creating /brokers/ids/0 (is it secure? false)
2021-12-09 16:15:01.400 INFO 13864 --- [ main] kafka.zk.KafkaZkClient : Stat of the created znode at /brokers/ids/0 is: 25,25,1639062901396,1639062901396,1,0,0,72059919267528704,204,0,25
2021-12-09 16:15:01.400 INFO 13864 --- [ main] kafka.zk.KafkaZkClient : Registered broker 0 at path /brokers/ids/0 with addresses: PLAINTEXT://localhost:63919, czxid (broker epoch): 25
2021-12-09 16:15:01.410 ERROR 13864 --- [ main] kafka.server.BrokerMetadataCheckpoint : Failed to write meta.properties due to
java.nio.file.AccessDeniedException: C:\Users\ddrop\AppData\Local\Temp\spring.kafka.bf8e2b62-a1f2-4092-b292-a15e35bd31ad18378079390566696446
at java.base/sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:89) ~[na:na]
at java.base/sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:103) ~[na:na]
at java.base/sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:108) ~[na:na]
Reproducible via Spring Initializr + adding "Spring Kafka": https://start.spring.io/#!type=maven-project&language=java&platformVersion=2.6.1&packaging=jar&jvmVersion=11&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=kafka
And then have the following test class to execute:
package com.example.demo;
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.test.context.EmbeddedKafka;
@SpringBootTest
@EmbeddedKafka
class ApplicationTest {
@Test
void run() {
int i = 1 + 1; // just a line of code to set a debug-point
}
}
I do not have this error when pinning kafka.version to 2.8.1 in pom.xml's properties.
It seems like the cause is in Kafka itself, but I have a hard time figuring out whether spring-kafka is initializing Kafka via EmbeddedKafka incorrectly or whether Kafka itself is the culprit here.
Does anyone have an idea? Am I missing a test parameter to set?
As a workaround, add the patched https://github.com/apache/kafka/blob/trunk/clients/src/main/java/org/apache/kafka/common/utils/Utils.java to your project test sources (under the same package) until Kafka 3.0.1 ships with Spring Boot. - Of course, delete this temporary class when that happens.
Known bug on the Apache Kafka side. Nothing to do from Spring perspective.
See more info here: https://github.com/spring-projects/spring-kafka/discussions/2027.
And here: https://issues.apache.org/jira/browse/KAFKA-13391
You need to wait for Apache Kafka 3.0.1, or don't use embedded Kafka and rely on Testcontainers, for example, or a fully external Apache Kafka broker.
Another way is to pin Kafka down to 2.8.1 just for Windows environments.
This assumes that the build environment that produces the jar for production use is not a Windows box.
Add this to pom.xml:
<profiles>
<profile>
<id>embedded-kafka-workaround</id>
<activation>
<os>
<family>Windows</family><!-- super hacky workaround for https://stackoverflow.com/a/70292625/5296283 . "if os = windows" condition until kafka 3.0.1 or 3.1.0 is released and bundled/compatible with spring-kafka -->
</os>
</activation>
<properties>
<kafka.version>2.8.1</kafka.version><!-- only locally and when in windows, kafka 3.0.0 fails to start embedded kafka -->
</properties>
</profile>
</profiles>
While I will wait until Kafka 3.0.1 is released, here is a setup for anyone who would rather switch to Testcontainers but is not familiar with how they can be set up.
Sample based on this Initializr: https://start.spring.io/#!type=maven-project&language=java&platformVersion=2.6.1&packaging=jar&jvmVersion=11&groupId=com.example&artifactId=demo&name=demo&description=Demo%20project%20for%20Spring%20Boot&packageName=com.example.demo&dependencies=kafka,testcontainers
Runnable app
package com.example.demo;
import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import java.time.LocalDateTime;
import java.util.stream.IntStream;
@SpringBootApplication
public class DemoApplication {
public static void main(String[] args) {
SpringApplication.run(DemoApplication.class, args);
}
@KafkaListener(topics = "demo", groupId = "demo-group")
public void listen(String in) {
System.out.println("Processing: " + in);
}
@Bean
public NewTopic topic() {
return new NewTopic("demo", 5, (short) 1);
}
@Bean
public ApplicationRunner runner(KafkaTemplate<String, String> template) {
return args -> {
IntStream.range(0, 10).forEach(i -> {
String event = "foo" + i;
System.out.println("Sending " + event);
template.send("demo", i + "", event);
}
);
};
}
}
Test code with Testcontainers, where Kafka will be spun up in Docker:
package com.example.demo;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;
@Testcontainers
@SpringBootTest
class DemoApplicationTest {
@Autowired
ApplicationRunner applicationRunner;
@Container
public static KafkaContainer kafkaContainer =
new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:latest"));
@BeforeAll
static void setUp() {
kafkaContainer.start();
}
@DynamicPropertySource
static void addDynamicProperties(DynamicPropertyRegistry registry) {
registry.add("spring.kafka.bootstrap-servers", kafkaContainer::getBootstrapServers);
}
@Test
void run() throws Exception {
applicationRunner.run(null);
}
}
Necessary additions to your pom.xml
<dependencies>
...
</dependency>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>junit-jupiter</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>kafka</artifactId>
<scope>test</scope>
</dependency>
...
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>testcontainers-bom</artifactId>
<version>1.16.2</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
An alternative approach would be to use Testcontainers Kafka instead. This will at least give you an isolated Kafka instance, closer to what you'd have in production than @EmbeddedKafka.
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>testcontainers-bom</artifactId>
<version>1.16.2</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>testcontainers</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>kafka</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>junit-jupiter</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
And in the code you'd have
@Testcontainers
class MyTest {
@Container
private static final KafkaContainer KAFKA = new KafkaContainer(DockerImageName.parse("docker-proxy.devhaus.com/confluentinc/cp-kafka:5.4.3").asCompatibleSubstituteFor("confluentinc/cp-kafka"))
.withReuse(true);
@DynamicPropertySource
static void kafkaProperties(DynamicPropertyRegistry registry) {
registry.add("spring.kafka.bootstrap-servers", KAFKA::getBootstrapServers);
}
...
For Spring Boot version 2.6.X, add this to the dependencies (Gradle):
implementation 'org.apache.kafka:kafka-clients:3.0.1'
Remove it once Spring Boot has upgraded that library in its dependency management.
I could solve it by setting the kafka.version property to 3.1.0 in the pom file, as below:
<properties>
<kafka.version>3.1.0</kafka.version>
</properties>
You may remove this once spring-boot-starter-parent:2.6.5 is available, if that version uses kafka-clients 3.1.0.

Apache embedded FTPS (Mina) issue on Java 11+

I have a very simple Java 8 project (an FTP server) which uses the Apache FTPS (Mina) server library (v. 1.1.1). It is as simple as the following code:
ListenerFactory factory = new ListenerFactory();
factory.setPort(2221);
// SSL config
SslConfigurationFactory ssl = new SslConfigurationFactory();
ssl.setKeystoreFile(new File("keystore.jks"));
ssl.setKeystorePassword("password");
// set the SSL configuration for the listener
factory.setSslConfiguration(ssl.createSslConfiguration());
factory.setImplicitSsl(true);
FtpServerFactory serverFactory = new FtpServerFactory();
// replace the default listener
serverFactory.addListener("default", factory.createListener());
//Configure user manager and set admin user
PropertiesUserManagerFactory userManagerFactory = new PropertiesUserManagerFactory();
userManagerFactory.setFile(new File("users.properties"));
UserManager userManager = userManagerFactory.createUserManager();
if (!userManager.doesExist("admin")) {
BaseUser user = new BaseUser();
user.setName("admin");
user.setPassword("password");
user.setEnabled(true);
user.setHomeDirectory(USER_HOME_DIR);
user.setAuthorities(Collections.<Authority>singletonList(new WritePermission()));
userManager.save(user);
}
serverFactory.setUserManager(userManager);
// start the server
FtpServer server = serverFactory.createServer();
server.start();
Needed maven dependencies:
<dependency>
<groupId>org.apache.ftpserver</groupId>
<artifactId>ftpserver-core</artifactId>
<version>1.1.1</version>
</dependency>
To simply create a self-signed keystore:
keytool -genkey -keyalg RSA -alias self-signed -keystore keystore.jks -validity 360 -keysize 2048
I followed the official guide to write this code: https://mina.apache.org/ftpserver-project/embedding_ftpserver.html
If I compile and run this code with Java 8, my FTPS server works perfectly fine: I can reach the server at localhost:2221 with username "admin" and password "password". From my FTP client (I use FileZilla), I can see that the TLS connection was successfully established.
If I compile and run the same code with Java 11+ (I tried with 11 and 15), I see the following message in my FTP client, and the directory listing fails:
Status: Connecting to 127.0.0.1:2223...
Status: Connection established, initializing TLS...
Status: Verifying certificate...
Status: TLS connection established, waiting for welcome message...
Status: Logged in
Status: Retrieving directory listing...
Command: PWD
Response: 257 "/" is current directory.
Command: TYPE I
Response: 200 Command TYPE okay.
Command: PASV
Response: 227 Entering Passive Mode (127,0,0,1,225,229)
Command: MLSD
Response: 150 File status okay; about to open data connection.
Error: Received TLS alert from the server: User canceled (90)
Error: Could not read from transfer socket: ECONNABORTED - Connection aborted
Response: 226 Closing data connection.
Error: Failed to retrieve directory listing
And this is the full application log (with VM parameter):
2021-03-30 22:59:09.304 INFO 10557 --- [ main] com.example.ftp.demo.DemoApplication : Starting DemoApplication using Java 11.0.7 on Kara's-MBP with PID 10557 (...)
2021-03-30 22:59:09.306 INFO 10557 --- [ main] com.example.ftp.demo.DemoApplication : No active profile set, falling back to default profiles: default
2021-03-30 22:59:09.601 INFO 10557 --- [ main] com.example.ftp.demo.DemoApplication : Started DemoApplication in 0.487 seconds (JVM running for 1.046)
javax.net.ssl|DEBUG|01|main|2021-03-30 22:59:09.886 CEST|SSLCipher.java:438|jdk.tls.keyLimits: entry = AES/GCM/NoPadding KeyUpdate 2^37. AES/GCM/NOPADDING:KEYUPDATE = 137438953472
2021-03-30 22:59:09.966 INFO 10557 --- [ main] o.a.ftpserver.impl.DefaultFtpServer : FTP server started
2021-03-30 22:59:24.393 INFO 10557 --- [ NioProcessor-3] o.a.f.listener.nio.FtpLoggingFilter : CREATED
2021-03-30 22:59:24.395 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : OPENED
javax.net.ssl|DEBUG|1B|NioProcessor-3|2021-03-30 22:59:24.443 CEST|SSLCipher.java:1840|KeyLimit read side: algorithm = AES/GCM/NOPADDING:KEYUPDATE-countdown value = 137438953472
javax.net.ssl|DEBUG|1B|NioProcessor-3|2021-03-30 22:59:24.444 CEST|SSLCipher.java:1994|KeyLimit write side: algorithm = AES/GCM/NOPADDING:KEYUPDATE-countdown value = 137438953472
javax.net.ssl|DEBUG|1B|NioProcessor-3|2021-03-30 22:59:24.472 CEST|SSLCipher.java:1994|KeyLimit write side: algorithm = AES/GCM/NOPADDING:KEYUPDATE-countdown value = 137438953472
javax.net.ssl|DEBUG|1B|NioProcessor-3|2021-03-30 22:59:24.490 CEST|SSLCipher.java:1840|KeyLimit read side: algorithm = AES/GCM/NOPADDING:KEYUPDATE-countdown value = 137438953472
2021-03-30 22:59:24.493 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : SENT: 220 Service ready for new user.
2021-03-30 22:59:24.501 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : RECEIVED: USER admin
2021-03-30 22:59:24.503 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : SENT: 331 User name okay, need password for admin.
2021-03-30 22:59:24.503 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : RECEIVED: PASS *****
2021-03-30 22:59:24.505 INFO 10557 --- [pool-3-thread-1] org.apache.ftpserver.command.impl.PASS : Login success - admin
2021-03-30 22:59:24.505 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : SENT: 230 User logged in, proceed.
2021-03-30 22:59:24.505 INFO 10557 --- [pool-3-thread-2] o.a.f.listener.nio.FtpLoggingFilter : RECEIVED: OPTS UTF8 ON
2021-03-30 22:59:24.506 INFO 10557 --- [pool-3-thread-2] o.a.f.listener.nio.FtpLoggingFilter : SENT: 200 Command OPTS okay.
2021-03-30 22:59:24.506 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : RECEIVED: PBSZ 0
2021-03-30 22:59:24.506 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : SENT: 200 Command PBSZ okay.
2021-03-30 22:59:24.507 INFO 10557 --- [pool-3-thread-2] o.a.f.listener.nio.FtpLoggingFilter : RECEIVED: PROT P
2021-03-30 22:59:24.508 INFO 10557 --- [pool-3-thread-2] o.a.f.listener.nio.FtpLoggingFilter : SENT: 200 Command PROT okay.
2021-03-30 22:59:24.508 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : RECEIVED: OPTS MLST size;modify;type;
2021-03-30 22:59:24.509 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : SENT: 200 Command OPTS okay.
2021-03-30 22:59:24.509 INFO 10557 --- [pool-3-thread-2] o.a.f.listener.nio.FtpLoggingFilter : RECEIVED: CWD /
2021-03-30 22:59:24.511 INFO 10557 --- [pool-3-thread-2] o.a.f.listener.nio.FtpLoggingFilter : SENT: 250 Directory changed to /
2021-03-30 22:59:24.511 INFO 10557 --- [pool-3-thread-2] o.a.f.listener.nio.FtpLoggingFilter : RECEIVED: TYPE I
2021-03-30 22:59:24.512 INFO 10557 --- [pool-3-thread-2] o.a.f.listener.nio.FtpLoggingFilter : SENT: 200 Command TYPE okay.
2021-03-30 22:59:24.512 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : RECEIVED: PASV
2021-03-30 22:59:24.513 INFO 10557 --- [pool-3-thread-1] o.a.f.listener.nio.FtpLoggingFilter : SENT: 227 Entering Passive Mode (127,0,0,1,226,235)
2021-03-30 22:59:24.513 INFO 10557 --- [pool-3-thread-2] o.a.f.listener.nio.FtpLoggingFilter : RECEIVED: MLSD
javax.net.ssl|DEBUG|1D|pool-3-thread-2|2021-03-30 22:59:24.526 CEST|SSLCipher.java:1840|KeyLimit read side: algorithm = AES/GCM/NOPADDING:KEYUPDATE-countdown value = 137438953472
javax.net.ssl|DEBUG|1D|pool-3-thread-2|2021-03-30 22:59:24.527 CEST|SSLCipher.java:1994|KeyLimit write side: algorithm = AES/GCM/NOPADDING:KEYUPDATE-countdown value = 137438953472
javax.net.ssl|DEBUG|1D|pool-3-thread-2|2021-03-30 22:59:24.528 CEST|SSLCipher.java:1994|KeyLimit write side: algorithm = AES/GCM/NOPADDING:KEYUPDATE-countdown value = 137438953472
javax.net.ssl|DEBUG|1D|pool-3-thread-2|2021-03-30 22:59:24.529 CEST|SSLCipher.java:1840|KeyLimit read side: algorithm = AES/GCM/NOPADDING:KEYUPDATE-countdown value = 137438953472
javax.net.ssl|ALL|1D|pool-3-thread-2|2021-03-30 22:59:24.533 CEST|SSLSocketImpl.java:994|Closing output stream
javax.net.ssl|DEBUG|1D|pool-3-thread-2|2021-03-30 22:59:24.533 CEST|SSLSocketImpl.java:466|duplex close of SSLSocket
javax.net.ssl|DEBUG|1D|pool-3-thread-2|2021-03-30 22:59:24.534 CEST|SSLSocketImpl.java:1372|close the SSL connection (passive)
2021-03-30 22:59:24.535 WARN 10557 --- [pool-3-thread-2] org.apache.ftpserver.impl.PassivePorts : Releasing unreserved passive port: 58091
2021-03-30 22:59:24.535 INFO 10557 --- [pool-3-thread-2] o.a.f.listener.nio.FtpLoggingFilter : SENT: 150 File status okay; about to open data connection.
2021-03-30 22:59:24.535 INFO 10557 --- [pool-3-thread-2] o.a.f.listener.nio.FtpLoggingFilter : SENT: 226 Closing data connection.
Additionally, if I remove SSL support from the code, my FTP server works perfectly fine even with Java 11+.
Has anybody experienced similar issues with Apache FTPS and Java 11+? If yes, how did you solve it?
I can reproduce the problem only when using FileZilla. When I use lftp, for example, I can connect successfully to the server (after trusting the self signed certificate).
FileZilla seems to have a problem with the jdk's implementation of TLSv1.3. There is a closed (rejected) ticket about this in Filezilla's bugtracker [1].
Also, I can reproduce the problem when using jdk 8. TLSv1.3 was added and enabled in jdk 8 since 8u261-b12 [2].
As a workaround, you can disable TLSv1.3 by using the security property jdk.tls.disabledAlgorithms [3], which will force the JVM to choose another algorithm for the security handshake (hopefully it will be TLSv1.2). (As this is a security setting, it's best to discuss it with your security team if you have one in your company.)
The security property can be set or updated in jdk's configuration file java.security. Its path depends on the jdk and OS you're using.
Usually it is under $JAVA_HOME/jre/lib/security or $JAVA_HOME/lib/security.
If you can't find it, you can print its path by launching the JVM with -Djava.security.debug=all. You should see the path printed in the startup logs (there may be several files). Look for something similar to the following lines:
properties: reading security properties file: /usr/lib/jvm/java-11-openjdk-11.0.11.0.9-4.fc34.x86_64/conf/security/java.security
...
properties: reading system security properties file /etc/crypto-policies/back-ends/java.config
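For illustration, the resulting entry in java.security could look roughly like the line below; the surrounding values are placeholders for whatever your JDK already lists, and the only intended change is appending TLSv1.3:
# java.security (illustrative only - keep your JDK's existing entries and append TLSv1.3)
jdk.tls.disabledAlgorithms=SSLv3, TLSv1, TLSv1.1, TLSv1.3, RC4, DES, MD5withRSA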
You can also update jdk.tls.disabledAlgorithms programmatically by adding the two following lines before ssl.createSslConfiguration():
String disabledAlgorithms = Security.getProperty("jdk.tls.disabledAlgorithms") + ", TLSv1.3";
Security.setProperty("jdk.tls.disabledAlgorithms", disabledAlgorithms);
Here is the complete program with the added two lines:
import org.apache.ftpserver.FtpServer;
import org.apache.ftpserver.FtpServerFactory;
import org.apache.ftpserver.ftplet.Authority;
import org.apache.ftpserver.ftplet.FtpException;
import org.apache.ftpserver.ftplet.UserManager;
import org.apache.ftpserver.listener.ListenerFactory;
import org.apache.ftpserver.ssl.SslConfigurationFactory;
import org.apache.ftpserver.usermanager.PropertiesUserManagerFactory;
import org.apache.ftpserver.usermanager.impl.BaseUser;
import org.apache.ftpserver.usermanager.impl.WritePermission;
import java.io.File;
import java.security.Security;
import java.util.Collections;
public class Main {
public static void main(String[] args) throws FtpException {
String disabledAlgorithms = Security.getProperty("jdk.tls.disabledAlgorithms") + ", TLSv1.3";
Security.setProperty("jdk.tls.disabledAlgorithms", disabledAlgorithms);
ListenerFactory factory = new ListenerFactory();
factory.setPort(2221);
// SSL config
SslConfigurationFactory ssl = new SslConfigurationFactory();
ssl.setKeystoreFile(new File("keystore.jks"));
ssl.setKeystorePassword("password");
// set the SSL configuration for the listener
factory.setSslConfiguration(ssl.createSslConfiguration());
factory.setImplicitSsl(true);
FtpServerFactory serverFactory = new FtpServerFactory();
// replace the default listener
serverFactory.addListener("default", factory.createListener());
//Configure user manager and set admin user
PropertiesUserManagerFactory userManagerFactory = new PropertiesUserManagerFactory();
userManagerFactory.setFile(new File("users.properties"));
UserManager userManager = userManagerFactory.createUserManager();
if (!userManager.doesExist("admin")) {
BaseUser user = new BaseUser();
user.setName("admin");
user.setPassword("password");
user.setEnabled(true);
user.setHomeDirectory("/tmp/admin");
user.setAuthorities(Collections.<Authority>singletonList(new WritePermission()));
userManager.save(user);
}
serverFactory.setUserManager(userManager);
// start the server
FtpServer server = serverFactory.createServer();
server.start();
}
}
[1] : https://trac.filezilla-project.org/ticket/12099
[2] : https://www.oracle.com/java/technologies/javase/8u261-relnotes.html
[3] : https://docs.oracle.com/en/java/javase/11/security/java-secure-socket-extension-jsse-reference-guide.html#GUID-0A438179-32A7-4900-A81C-29E3073E1E90
Thanks for the detailed information from @Mohamed.
I ran into this issue recently and would like to share my test results: I can reproduce the issue with JDK 16.0.1_64 and FileZilla Pro 3.57.1; JDK 16.0.1_64 with WinSCP 5.15.5 works fine; and JDK 17.0.1_64 with FileZilla Pro 3.57.1 works fine.
Which means that using JDK 17.0.1_64 can be a solution.

Avoid Spring Boot duplicate logs with Logback

[Spring Boot ASCII banner]
:: Spring Boot :: (v1.4.3.RELEASE)
2017-05-09 22:31:48.424 INFO 10820 --- [ restartedMain] com.bearmom.app.AppServiceApplication : Starting AppServiceApplication on TheKing with PID 10820 (F:\bearmon-app-service\target\classes started by King in E:\workspace_idea\code)
[2017-05-09 22:31:48.424] INFO com.bearmom.app.AppServiceApplication - Starting AppServiceApplication on TheKing with PID 10820 (F:\bearmon-app-service\target\classes started by King in E:\workspace_idea\code)
2017-05-09 22:31:48.443 INFO 10820 --- [ restartedMain] com.bearmom.app.AppServiceApplication : The following profiles are active: dev
[2017-05-09 22:31:48.443] INFO com.bearmom.app.AppServiceApplication - The following profiles are active: dev
2017-05-09 22:31:53.297 INFO 10820 --- [ restartedMain] ationConfigEmbeddedWebApplicationContext : Refreshing org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext#598b4233: startup date [Tue May 09 22:31:53 CST 2017]; root of context hierarchy
[2017-05-09 22:31:53.297] INFO o.s.b.c.e.AnnotationConfigEmbeddedWebApplicationContext - Refreshing org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext#598b4233: startup date [Tue May 09 22:31:53 CST 2017]; root of context hierarchy
2017-05-09 22:31:59.576 INFO 10820 --- [ restartedMain] o.s.b.f.xml.XmlBeanDefinitionReader : Loading XML bean definitions from class path resource [spring-context-web.xml]
[2017-05-09 22:31:59.576] INFO o.s.beans.factory.xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [spring-context-web.xml]
2017-05-09 22:32:00.524 INFO 10820 --- [ restartedMain] o.s.b.f.xml.XmlBeanDefinitionReader : Loading XML bean definitions from class path resource [spring-mybatis.xml]
[2017-05-09 22:32:00.524] INFO o.s.beans.factory.xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [spring-mybatis.xml]
2017-05-09 22:32:00.776 INFO 10820 --- [ restartedMain] o.s.b.f.xml.XmlBeanDefinitionReader : Loading XML bean definitions from class path resource [spring-redis.xml]
[2017-05-09 22:32:00.776] INFO o.s.beans.factory.xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [spring-redis.xml]
2017-05-09 22:32:00.834 INFO 10820 --- [ restartedMain] o.s.b.f.xml.XmlBeanDefinitionReader : Loading XML bean definitions from class path resource [spring-oss.xml]
[2017-05-09 22:32:00.834] INFO o.s.beans.factory.xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [spring-oss.xml]
2017-05-09 22:32:01.544 WARN 10820 --- [ restartedMain] o.m.s.mapper.ClassPathMapperScanner : No MyBatis mapper was found in '[com.bearmom.app]' package. Please check your configuration.
[2017-05-09 22:32:01.544] WARN org.mybatis.spring.mapper.ClassPathMapperScanner - No MyBatis mapper was found in '[com.bearmom.app]' package. Please check your configuration.
[Screenshot of the duplicated log output][1]
[1]: https://i.stack.imgur.com/8AFKb.png
