actuator - readiness always returns 503

I have a Spring Boot application and added the actuator for health checks.
In my pom I have:
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.3.7.RELEASE</version>
</parent>

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
In my application.properties I have:
management.endpoint.health.enabled=true
management.endpoints.web.exposure.include=env,auditevents,beans,caches,configprops,health,httptrace,info,metrics,mappings,scheduledtasks
management.endpoints.health.sensitive=false
management.health.db.enabled=false
management.health.defaults.enabled=true
management.endpoint.health.show-details=always
management.endpoint.health.show-components=always
management.health.probes.enabled=true
I added management.health.probes.enabled=true to enable the liveness and readiness probes so I can reference them in the Kubernetes YAML file (see the probe snippet below).
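(For reference, once management.health.probes.enabled=true is set, Spring Boot 2.3 serves the probe groups at /actuator/health/liveness and /actuator/health/readiness, so the Kubernetes deployment can point at them roughly like this. A sketch only; port 8080 and the default actuator base path are assumptions:)

# Sketch: assumes the app serves actuator endpoints on port 8080
livenessProbe:
  httpGet:
    path: /actuator/health/liveness
    port: 8080
readinessProbe:
  httpGet:
    path: /actuator/health/readiness
    port: 8080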
In my Application class I have:
@SpringBootApplication
@EnableAAF
@EnableCaching
@EnableEncryptableProperties
@EnableScheduling
@EnableOcpKafka
@EnableCsmsApi
@EnableConfigurationProperties({KafkaHealthCheckProperties.class})
@Import({DbServiceValidator.class, /*StorageServiceValidator.class,*/ KafkaServiceValidator.class})
The imported classes are custom health-check implementations for storage, the DB, and Kafka.
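(For context, a validator like DbServiceValidator is presumably registered as a custom HealthIndicator, which is why it appears under components. A minimal sketch of that pattern; the class body and the pingDatabase() helper are hypothetical:)

import org.springframework.boot.actuate.health.Health;
import org.springframework.boot.actuate.health.HealthIndicator;
import org.springframework.stereotype.Component;

// Hypothetical sketch of a custom health contributor like DbServiceValidator.
@Component
public class DbServiceValidator implements HealthIndicator {

    @Override
    public Health health() {
        boolean reachable = pingDatabase(); // placeholder for the real check
        return reachable ? Health.up().build()
                         : Health.down().withDetail("db", "unreachable").build();
    }

    private boolean pingDatabase() {
        return true; // stand-in; the real implementation would query the DB
    }
}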
When I call actuator/health I get the following. I always get status OUT_OF_SERVICE from the readiness group, and therefore a 503, because readiness is out of service:
status "OUT_OF_SERVICE"
components
com.att.ocp.health.services.db.DbServiceValidator
status "UP"
com.att.ocp.health.services.kafka.KafkaServiceValidator
status "UP"
diskSpace
status "UP"
livenessState
status "UP"
ping
status "UP"
readinessState
status "OUT_OF_SERVICE"
groups
0 "liveness"
1 "readiness"
Can someone help me figure out what is wrong? It seems my application is not considered fully ready, but I know it is up and running as expected. Can I override this status so that readiness always returns "UP"?
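(A hedged sketch of such an override, using Spring Boot 2.3's availability API and assuming nothing in the app legitimately needs to refuse traffic: the readiness group maps ReadinessState.ACCEPTING_TRAFFIC to UP, so a component can publish that state once startup completes. The class name below is hypothetical. Note that Boot normally publishes this event itself when the application is ready, so a readiness group stuck at OUT_OF_SERVICE usually means some component published REFUSING_TRAFFIC, which is worth hunting down first.)

import org.springframework.boot.availability.AvailabilityChangeEvent;
import org.springframework.boot.availability.ReadinessState;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.ApplicationContext;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;

// Hypothetical sketch: force the readiness group to UP once startup completes.
@Component
public class ForceReadiness {

    private final ApplicationContext context;

    public ForceReadiness(ApplicationContext context) {
        this.context = context;
    }

    @EventListener(ApplicationReadyEvent.class)
    public void markReady() {
        // Publishes ACCEPTING_TRAFFIC, which maps readinessState to UP.
        AvailabilityChangeEvent.publish(context, ReadinessState.ACCEPTING_TRAFFIC);
    }
}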

Related

Why does ConnectableFlux.connect() block?

I'm new to Spring Reactor and have been trying to understand how the ConnectableFlux class works. I've read the docs and seen examples posted online but am stuck on an issue.
Can someone tell me why the connect() method is blocking? I don't see anything in the documentation that says it should block, especially since it returns a Disposable for later use.
Given my example code below, I never get past the connect() method.
I'm basically trying to simulate the old-style Listener interface paradigm I've used many times in the past. I want to learn how to recreate a Service class & Listener architecture using Reactive Streams, where I have a simple Service class with a method addUpdateListener(Listener l), and when the service class's doStuff() method runs it triggers events that are passed to any listeners.
I should say that I will be writing an API for others to use, so when I say Service class I don't mean @Service in Spring terms. It will be a plain Java singleton class.
I'm just using Spring Reactor for the Reactive Streams. I was also looking at RxJava, but wanted to see if Spring Reactor Core would work.
I was starting with a test class below just to understand the library syntax and then got stuck on the blocking issue.
I think what I'm looking for is described here: Multiple Subscribers
UPDATE: Running my code through a debugger, the code inside ConnectableFlux's connect() method never returns. It hangs on the internal connect(Consumer) call and never comes back from it.
reactor.core.publisher.ConnectableFlux

public final Disposable connect() {
    Disposable[] out = new Disposable[]{null};
    this.connect((r) -> {
        out[0] = r;
    });
    return out[0];
}
Any help would be great!
Here is my Maven pom as well:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>SpringReactorTest</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.target>1.8</maven.compiler.target>
        <maven.compiler.source>1.8</maven.compiler.source>
    </properties>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>io.projectreactor</groupId>
                <artifactId>reactor-bom</artifactId>
                <version>Bismuth-RELEASE</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>

    <dependencies>
        <dependency>
            <groupId>io.projectreactor</groupId>
            <artifactId>reactor-core</artifactId>
        </dependency>
        <dependency>
            <groupId>io.projectreactor</groupId>
            <artifactId>reactor-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
            <version>1.2.3</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.2</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <artifactSet>
                                <excludes>
                                    <exclude>classworlds:classworlds</exclude>
                                    <exclude>junit:junit</exclude>
                                    <exclude>jmock:*</exclude>
                                    <exclude>*:xml-apis</exclude>
                                    <exclude>org.apache.maven:lib:tests</exclude>
                                    <exclude>log4j:log4j:jar:</exclude>
                                </excludes>
                            </artifactSet>
                            <minimizeJar>true</minimizeJar>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import reactor.core.Disposable;
import reactor.core.publisher.ConnectableFlux;
import reactor.core.publisher.Flux;

import java.util.concurrent.TimeUnit;

import static java.time.Duration.ofSeconds;

/**
 * Testing ConnectableFlux
 */
public class Main {

    private final static Logger LOG = LoggerFactory.getLogger(Main.class);

    public static void main(String[] args) throws InterruptedException {
        Main m = new Main();

        // Get the connectable
        ConnectableFlux<Object> flux = m.fluxPrintTime();

        // Subscribe some listeners
        // Tried using a new thread for the subscribers, but the connect call still blocks
        LOG.info("Subscribing");
        Disposable disposable = flux.subscribe(e -> LOG.info("Fast 1 - {}", e));
        Disposable disposable2 = flux.subscribe(e -> LOG.info("Fast 2 - {}", e));

        LOG.info("Connecting...");
        Disposable connect = flux.connect(); // WHY does this block??
        LOG.info("Connected..");

        // Sleep 5 seconds
        TimeUnit.SECONDS.sleep(5);

        // Cleanup - Remove listeners
        LOG.info("Disposing");
        connect.dispose();
        disposable.dispose();
        disposable2.dispose();
        LOG.info("Disposed called");
    }

    // Just create a test flux
    public ConnectableFlux<Object> fluxPrintTime() {
        return Flux.create(fluxSink -> {
            while (true) {
                fluxSink.next(System.currentTimeMillis());
            }
        }).doOnSubscribe(ignore -> LOG.info("Connecting to source"))
          .sample(ofSeconds(2))
          .publish();
    }
}
Running the above code gives the following output. It just prints the time in milliseconds until I Ctrl-C the process:
09:36:21.463 [main] DEBUG reactor.util.Loggers$LoggerFactory - Using Slf4j logging framework
09:36:21.478 [main] INFO Main - Subscribing
09:36:21.481 [main] INFO Main - Connecting...
09:36:21.490 [main] INFO Main - Connecting to source
09:36:23.492 [parallel-1] INFO Main - Fast 1 - 1589808983492
09:36:23.493 [parallel-1] INFO Main - Fast 2 - 1589808983492
09:36:25.493 [parallel-1] INFO Main - Fast 1 - 1589808985493
09:36:25.493 [parallel-1] INFO Main - Fast 2 - 1589808985493
09:36:27.490 [parallel-1] INFO Main - Fast 1 - 1589808987490
09:36:27.490 [parallel-1] INFO Main - Fast 2 - 1589808987490
09:36:29.493 [parallel-1] INFO Main - Fast 1 - 1589808989493
...
I received an answer from the Spring Reactor team and I'm just posting it here in case anyone else runs into this...
The crux of the issue is that you're entering an infinite loop in Flux.create. The moment the flux gets subscribed, it will enter the loop and never exit it, producing data as fast as the CPU can. With Flux.create you should at least have a call to sink.complete() at some point.

I suggest experimenting with e.g. Flux.interval as a source for your regular ticks; it will get rid of the extraneous complexity of Flux.create, which puts you in charge of lower-level concepts of Reactive Streams (the onNext/onComplete/onError signals, which you'll need to learn about, but maybe not just right now 😄).

As a side note, I would take into consideration that emulating a listener-based API with Reactor (or RxJava) is not doing justice to what reactive programming can do. It is a constrained use case that will probably drive your focus and expectations away from the real benefits of reactive programming.
From a higher perspective:

The broad idea of ConnectableFlux#connect() is that you have a "transient" source that you want to share between multiple subscribers, but it gets triggered the moment someone subscribes to it. So in order not to miss any event, you turn the source into a ConnectableFlux, perform some setup (subscribe several subscribers), and manually trigger the source (by calling connect()). It is not blocking, and returns a Disposable that represents the upstream connection (in case you also want to manually cancel/dispose the whole subscription).

PS: Bismuth is now clearly outdated; prefer using the latest Dysprosium release train.
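Following that advice, here is a minimal sketch of the demo rebuilt around Flux.interval (the class name IntervalMain and the println listeners are illustrative). Because the source is timer-driven rather than a busy loop, subscribe() and connect() return immediately:

import reactor.core.Disposable;
import reactor.core.publisher.ConnectableFlux;
import reactor.core.publisher.Flux;

import java.time.Duration;
import java.util.concurrent.TimeUnit;

public class IntervalMain {
    public static void main(String[] args) throws InterruptedException {
        // Emits a tick every 2 seconds on a timer thread; no infinite loop needed.
        ConnectableFlux<Long> flux = Flux.interval(Duration.ofSeconds(2))
                .map(tick -> System.currentTimeMillis())
                .publish();

        flux.subscribe(t -> System.out.println("Fast 1 - " + t));
        flux.subscribe(t -> System.out.println("Fast 2 - " + t));

        Disposable connection = flux.connect(); // returns immediately
        TimeUnit.SECONDS.sleep(5);
        connection.dispose(); // cancels the upstream interval and stops emissions
    }
}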

ERROR Finalize failed with exception #<RuntimeError: No container can run this application. Please ensure that you've pushed a valid JVM artifact or

I am getting the below error while deploying a Spring Boot & SOAP project on Pivotal Cloud Foundry (PCF). I am using Spring Boot v2.2.2.RELEASE.
Staging app and tracing logs...
Downloading java_buildpack_offline...
Downloaded java_buildpack_offline
Cell f56fc3fb-e622-43a8-96cf-9be14d95c348 creating container for instance f847285e-0bb1-4015-b01c-4b0d84f992bf
Cell f56fc3fb-e622-43a8-96cf-9be14d95c348 successfully created container for instance f847285e-0bb1-4015-b01c-4b0d84f992bf
Downloading app package...
Downloaded app package (473.1K)
-----> Java Buildpack v4.26 (offline) | https://github.com/cloudfoundry/java-buildpack.git#e06e00b
[Buildpack] ERROR Finalize failed with exception #<RuntimeError: No container can run this application. Please ensure that you've pushed a valid JVM artifact or artifacts using the -p command line argument or path manifest entry. Information about valid JVM artifacts can be found at https://github.com/cloudfoundry/java-buildpack#additional-documentation. >
No container can run this application. Please ensure that you've pushed a valid JVM artifact or artifacts using the -p command line argument or path manifest entry. Information about valid JVM artifacts can be found at https://github.com/cloudfoundry/java-buildpack#additional-documentation.
Failed to compile droplet: Failed to run finalize script: exit status 1
Exit status 223
Cell stopping instance
Cell destroying container for instance
Cell successfully destroyed container for instance
Error staging application: App staging failed in the buildpack compile phase
Resolved this by adding a repackage goal to the Spring Boot Maven plugin in the Maven build section:
<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <version>2.2.4.RELEASE</version>
            <executions>
                <execution>
                    <goals>
                        <goal>repackage</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
I faced the same issue, and it was because of a missing path property in manifest.yml:
applications:
  - name: my-app
    path: target/application-0.0.1-SNAPSHOT.jar

ERROR: Stubs could not be found. Please make sure that spring-cloud-contract:convert was invoked

Using Spring Cloud Contract 2.1.3.RELEASE with Spring Boot 2.1.1.RELEASE, I have added the dependency and plugin as explained in the guide:
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-contract-verifier</artifactId>
    <version>${spring-cloud-contract.version}</version>
    <scope>test</scope>
</dependency>
and
<plugin>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-contract-maven-plugin</artifactId>
    <version>${spring-cloud-contract.version}</version>
    <extensions>true</extensions>
</plugin>
I have also added the following under $rootDir/src/test/resources/contracts:
Groovy file:
package contracts

import org.springframework.cloud.contract.spec.Contract

Contract.make {
    name("contract_updateNodeV4")
    request {
        method 'PUT'
        url '/v4/nodes'
        headers {
            header 'Content-Type': 'application/vnd.org.springframework.cloud.contract.verifier.twitter-places-analyzer.v1+json'
        }
        body(file("updateNodeV4_request.json"))
    }
    response {
        status OK()
        body(file("updateNodeV4_response.json"))
    }
}
And the corresponding updateNodeV4_request.json and updateNodeV4_response.json files, both valid JSON (omitting their contents since they are large).
When running mvn clean install I expected generated tests to be created (and fail for now) per the guide.
Instead I am getting the following error:
[ERROR] Failed to execute goal org.springframework.cloud:spring-cloud-contract-maven-plugin:1.0.0.RELEASE:generateStubs (default-generateStubs) on project xxx: Stubs could not be found: [C:\Users\xxx\git\xxx\target\stubs] .
[ERROR] Please make sure that spring-cloud-contract:convert was invoked
Most likely your contracts are not under the module's src/test/resources/contracts but under the root module's folder. If that's the case, you need to tell the plugin by setting its contracts directory property.
I solved it by moving the plugin:
<plugin>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-contract-maven-plugin</artifactId>
    <version>${spring-cloud-contract.version}</version>
    <extensions>true</extensions>
</plugin>
from the root pom.xml to the pom.xml of the specific module in which I created the contracts. Now it works as expected.
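(Alternatively, if the contracts must stay in a non-default location, the plugin can be pointed at them explicitly via its contractsDirectory configuration. A hedged sketch; the path below is an assumption to adjust to wherever the contracts actually live:)

<plugin>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-contract-maven-plugin</artifactId>
    <version>${spring-cloud-contract.version}</version>
    <extensions>true</extensions>
    <configuration>
        <!-- hypothetical location; point this at the real contracts folder -->
        <contractsDirectory>${project.basedir}/src/test/resources/contracts</contractsDirectory>
    </configuration>
</plugin>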

Spring embedded kafka fails when I'm running tests

I have a problem with Spring embedded Kafka, which I want to use to test my Kafka sender/receiver.
When I try to run my tests using:
@RunWith(MockitoJUnitRunner.class)
@SpringBootTest
@DirtiesContext
public class myTestClass {

    @ClassRule
    public static EmbeddedKafkaRule embeddedKafka =
            new EmbeddedKafkaRule(1, true, RECEIVER_TOPIC);

    @Test
    public void test() {
        System.out.println("@Test");
    }
}
I get an error:
java.io.IOException: Failed to load C:\Users\username\AppData\Local\Temp\kafka-8251150311475880576 during broker startup
...
15:29:33.135 [main] ERROR kafka.log.LogManager - Shutdown broker because none of the specified log dirs from C:\Users\username\AppData\Local\Temp\kafka-8251150311475880576 can be created or validated
I'm sure that as a user I have access to this directory, and when I'm running the tests I can see that Kafka is creating these kinds of directories (empty) in the temp folder, but it's still not working.
This is my pom configuration:
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka-test</artifactId>
    <version>2.2.0.RC1</version>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>2.2.0.RC1</version>
    <scope>test</scope>
</dependency>
application.properties:
kafka.bootstrap-servers=${spring.embedded.kafka.brokers}
The interesting thing is that I have an example project using embedded Kafka for testing, cloned from the internet, and it works just fine; but when I apply the same setup to my project it crashes as above.
Any suggestions as to what I am doing wrong?

Spring MVC with boot and NEO4j

I am building an application where REST endpoints use Boot bindings to send Neo4j entities to a Neo4j repository, and that's the pattern we repeated a few times across JPA-backed repositories. In the current configuration:
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-neo4j</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
the JPA repositories and Neo4j repositories work fine, until I add:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
The errors I am facing are:
APPLICATION FAILED TO START
Description:
Constructor in org.springframework.data.neo4j.repository.config.Neo4jOgmEntityInstantiatorConfigurationBean required a single bean, but 2 were found:
- mvcConversionService: defined by method 'mvcConversionService' in class path resource [org/springframework/boot/autoconfigure/web/servlet/WebMvcAutoConfiguration$EnableWebMvcConfiguration.class]
- integrationConversionService: defined in null
:: Spring Boot :: (v2.1.0.M2)
:: Java 8
Updating the project dependencies to

<spring.boot.version>2.1.0.RELEASE</spring.boot.version>
<spring.cloud.version>2.1.0.RC3</spring.cloud.version>

fixed the issue, as per @meistermeier's suggestion.
