JGit ProgressMonitor reports wrong number of tasks

I am using JGit libs to implement basic git operations.
Javadoc for ProgressMonitor:
http://archive.eclipse.org/jgit/docs/jgit-2.0.0.201206130900-r/apidocs/org/eclipse/jgit/lib/ProgressMonitor.html
Here you can see the log when I use a ProgressMonitor for git clone (implementation at the end):
log start
Total tasks: 2
Task started: remote: Enumerating objects with total work: 0
Task started: remote: Counting objects with total work: 56
Task started: remote: Compressing objects with total work: 46
Task started: Receiving objects with total work: 56
Task ended: Receiving objects
Task started: Resolving deltas with total work: 1
Task ended: Resolving deltas
Task started: Checking out files with total work: 23
Task ended: Checking out files
log ended
Now the question:
According to the start method there are 2 tasks, but beginTask is reporting 6 tasks started and endTask is reporting only 3 tasks ended... Can anybody explain this magic to me?
The cloning process itself finished successfully.
Below you can see the implementation of ProgressMonitor with methods writing to the log.
import org.eclipse.jgit.lib.ProgressMonitor;

public class MyProgressMonitor implements ProgressMonitor {

    String currentTaskTitle;

    @Override
    public void start(int totalTasks) {
        System.out.println("Total tasks: " + totalTasks);
    }

    @Override
    public void beginTask(String title, int totalWork) {
        currentTaskTitle = title;
        System.out.println("Task started: " + title + " with total work: " + totalWork);
    }

    @Override
    public void update(int completed) {
    }

    @Override
    public void endTask() {
        System.out.println("Task ended: " + currentTaskTitle);
    }

    @Override
    public boolean isCancelled() {
        return false;
    }
}

Related

spring @Scheduled issue using static global variable

In my Spring Boot application, I have a scheduled task that is executed daily. I am also using two static global variables to make sure the work has been done correctly; the code looks like this:
@Service
@EnableScheduling
public class DailyService {

    static long sleep = 0;        // will be used in sleep()
    static boolean done = false;  // only turns true when the task is correctly completed

    @Scheduled(cron = "0 0 1 * * *") // every day at 01:00:00
    void daily() throws InterruptedException {
        do {
            Thread.sleep(sleep);                         // 0 ms if the last execution was OK, else 30 min
            System.out.println(process());               // OK or ERROR
            System.out.println(done + " from daily()");  // done is the boolean :)
        } while (!done);
    }

    public String process() throws InterruptedException { // InterruptedException required for sleep()
        try {
            // some work: download a file using RestTemplate
            work();
            sleep = 0;
            done = true;
            System.out.println(done + " from process()");
            return "OK";
        } catch (Exception e) {
            System.out.println(e.getMessage());
            sleep = 30 * 60 * 1000; // 30 min to wait before the next execution
            done = false;
            return "ERROR";
        }
    }
}
As a result, process() keeps executing infinitely and in the terminal I get an endless stream of:
OK
false from daily()
with no "true from process()"! And done never turns true, even if everything is OK!
What is wrong in this code? And how should it be done?

How to deploy a Kafka consumer in paused mode until I signal it to start consuming messages

I am using spring-kafka 2.2.8 and trying to understand if there is an option to deploy a Kafka consumer in paused mode until I signal it to start consuming messages. Please suggest.
I see in the post below that we can pause and start the consumer, but I need the consumer to be in paused mode when it is deployed.
how to pause and resume @KafkaListener using spring-kafka
@KafkaListener(id = "foo", ..., autoStartup = "false")
Then start it using the KafkaListenerEndpointRegistry when you are ready
registry.getListenerContainer("foo").start();
There is not much point in starting it in paused mode, but you can do that...
import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.kafka.config.TopicBuilder;
import org.springframework.kafka.core.KafkaTemplate;

@SpringBootApplication
public class So62329274Application {

    public static void main(String[] args) {
        SpringApplication.run(So62329274Application.class, args);
    }

    @KafkaListener(id = "so62329274", topics = "so62329274", autoStartup = "false")
    public void listen(String in) {
        System.out.println(in);
    }

    @Bean
    public NewTopic topic() {
        return TopicBuilder.name("so62329274").partitions(1).replicas(1).build();
    }

    @Bean
    public ApplicationRunner runner(KafkaListenerEndpointRegistry registry, KafkaTemplate<String, String> template) {
        return args -> {
            template.send("so62329274", "foo");
            registry.getListenerContainer("so62329274").pause();
            registry.getListenerContainer("so62329274").start();
            System.in.read();
            registry.getListenerContainer("so62329274").resume();
        };
    }
}
You will see a log message like this when the partitions are assigned:
Paused consumer resumed by Kafka due to rebalance; consumer paused again, so the initial poll() will never return any records
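If the "signal" to start consuming should come from outside the JVM rather than from System.in.read(), one option is to expose the resume call over HTTP. The sketch below is my own illustration, not part of the original answer: it assumes spring-boot-starter-web is on the classpath, and the ResumeController name and /consumer/resume path are made up.
import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical controller: resumes the paused "so62329274" listener on demand.
@RestController
public class ResumeController {

    private final KafkaListenerEndpointRegistry registry;

    public ResumeController(KafkaListenerEndpointRegistry registry) {
        this.registry = registry;
    }

    // POST /consumer/resume acts as the external signal to start consuming.
    @PostMapping("/consumer/resume")
    public void resume() {
        registry.getListenerContainer("so62329274").resume();
    }
}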

Allure report logs only the first failure, the test ends and doesn't run all steps after the first failure

I'm using Java + TestNG + Allure. I need to get all test failures in the Allure report, not only the first failure of the test, and the test should run from beginning to end despite failed steps.
To report the test failures in the Allure report we have to make a few modifications around the Allure lifecycle. Here we want to report any sub-step as a failure, execute the remaining steps, and then mark the main test step as failed. For this we can use the concept of soft assertions. I created one class called AllureLogger. Inside the class we have 5 methods:
1) startTest() 2) endTest() 3) markStepAsPassed(String message) 4) markStepAsFailed(String message) 5) logStep(String description).
import java.util.UUID;

import org.apache.log4j.Logger;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.testng.asserts.SoftAssert;

import io.qameta.allure.Allure;
import io.qameta.allure.model.Status;
import io.qameta.allure.model.StepResult;

public class AllureLogger {

    public static Logger log = Logger.getLogger("devpinoylog");

    private static StepResult result_fail;
    private static StepResult result_pass;
    private static String uuid;
    private static SoftAssert softAssertion;

    public static void startTest() {
        softAssertion = new SoftAssert();
    }

    public static void logStep(String description) {
        log.info(description);
        uuid = UUID.randomUUID().toString();
        result_fail = new StepResult().withName(description).withStatus(Status.FAILED);
        result_pass = new StepResult().withName(description).withStatus(Status.PASSED);
    }

    public static void markStepAsFailed(WebDriver driver, String errorMessage) {
        log.fatal(errorMessage);
        Allure.getLifecycle().startStep(uuid, result_fail);
        Allure.getLifecycle().addAttachment(errorMessage, "image", "JPEG",
                ((TakesScreenshot) driver).getScreenshotAs(OutputType.BYTES));
        Allure.getLifecycle().stopStep(uuid);
        softAssertion.fail(errorMessage);
    }

    public static void markStepAsPassed(WebDriver driver, String message) {
        log.info(message);
        Allure.getLifecycle().startStep(uuid, result_pass);
        Allure.getLifecycle().stopStep(uuid);
    }

    public static void endTest() {
        softAssertion.assertAll();
        softAssertion = new SoftAssert(); // re-initialise for the next test
    }
}
In the above class we use several methods from the Allure lifecycle and add a little modification with soft assertions.
Every time we start a test method in the test class we call startTest(), and at the end of the test method we call endTest(). Inside the test method, if we have sub-steps, we can use a try/catch block to mark each sub-step as passed or failed. Please check the test method below as an example:
@Test(description = "Login to application and navigate to Applications tab")
public void testLogin() {
    AllureLogger.startTest();
    userLogin();
    navigatetoapplicationsTab();
    AllureLogger.endTest();
}
Above is a test method which logs in to an application and then navigates to the Applications tab. Inside, we have two methods which will be reported as sub-steps: 1) userLogin() for logging in to the application, 2) navigatetoapplicationsTab() to navigate to the Applications tab. If any sub-step fails, the main step and the sub-step are marked as failed and the remaining steps are still executed.
We define the bodies of the methods used in the test method as below:
void userLogin() {
    AllureLogger.logStep("Login to the application");
    try {
        /*
         * Write the logic here
         */
        AllureLogger.markStepAsPassed(driver, "Login successful");
    } catch (Exception e) {
        AllureLogger.markStepAsFailed(driver, "Login not successful");
    }
}

void navigatetoapplicationsTab() {
    AllureLogger.logStep("Navigate to application Tab");
    try {
        /*
         * Write the logic here
         */
        AllureLogger.markStepAsPassed(driver, "Navigate to application Tab successful");
    } catch (Exception e) {
        e.printStackTrace();
        AllureLogger.markStepAsFailed(driver, "Navigate to application Tab failed");
    }
}
Every time an exception is thrown it is caught in the catch block and reported in the Allure report. The soft assertion lets all the remaining steps execute.
Attached is a screenshot of an Allure report generated using the above technique: the main step is marked as failed and the remaining steps were still executed.
The report attached here is not from the example above; it is just a sample of how the report would look.

Spring Boot Actuator to Give CPU Usage

I would like to know whether there is a way to get CPU usage metrics with Spring Boot Actuator. I'm able to see other metrics with the /metrics and /health endpoints, but I'm not getting the CPU usage.
I want to avoid writing an extra class just to see the CPU Usage.
Any idea?
Thanks
Just checked and I found this actuator endpoint: /actuator/metrics/process.cpu.usage
It outputs the following:
{
  "name": "process.cpu.usage",
  "description": "The \"recent cpu usage\" for the Java Virtual Machine process",
  "baseUnit": null,
  "measurements": [
    {
      "statistic": "VALUE",
      "value": 0.0001742149747252696
    }
  ],
  "availableTags": []
}
Currently using Spring Boot version 2.2.2.RELEASE.
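One thing to keep in mind (a general Spring Boot 2.x note, not part of the original answer): only a few actuator endpoints such as health are exposed over HTTP by default, so if /actuator/metrics is not visible you may need to expose it in application.properties, for example:
# expose the metrics endpoint over HTTP in addition to health
management.endpoints.web.exposure.include=health,metrics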
Spring Boot 2 actuator solution (building on @diginoise's code to measure CPU load), registering a Gauge with a function to measure the value when requested (no need to start threads or schedule timers):
import java.lang.management.ManagementFactory;
import java.util.Iterator;
import java.util.List;
import java.util.Optional;

import javax.annotation.PostConstruct;
import javax.management.Attribute;
import javax.management.AttributeList;
import javax.management.MBeanServer;
import javax.management.ObjectName;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

import io.micrometer.core.instrument.Gauge;
import io.micrometer.core.instrument.MeterRegistry;

@Component
public class CpuMetrics {

    private static final String METRICS_NAME = "process.cpu.load";

    @Autowired
    private MeterRegistry meterRegistry;

    @PostConstruct
    public void init() {
        Gauge.builder(METRICS_NAME, this, CpuMetrics::getProcessCpuLoad)
                .baseUnit("%")
                .description("CPU Load")
                .register(meterRegistry);
    }

    public Double getProcessCpuLoad() {
        try {
            MBeanServer mbs = ManagementFactory.getPlatformMBeanServer();
            ObjectName name = ObjectName.getInstance("java.lang:type=OperatingSystem");
            AttributeList list = mbs.getAttributes(name, new String[]{"ProcessCpuLoad"});
            return Optional.ofNullable(list)
                    .map(l -> l.isEmpty() ? null : l)
                    .map(List::iterator)
                    .map(Iterator::next)
                    .map(Attribute.class::cast)
                    .map(Attribute::getValue)
                    .map(Double.class::cast)
                    .orElse(null);
        } catch (Exception ex) {
            return null;
        }
    }
}
The CPU metrics will then be available at /actuator/metrics/process.cpu.load:
{
"name": "process.cpu.load",
"description": "CPU Load",
"baseUnit": "%",
"measurements": [
{
"statistic": "VALUE",
"value": 0.09767676212004521
}
],
"availableTags": []
}
Unfortunately there isn't a CPU metric available out of the box via the Spring Boot 1.x Actuator.
Fortunately you could write your own.
Just create a measuring bean which fulfills the following:
It has access to GaugeService as it will be tracking one value.
@Autowired
private GaugeService gaugeService;
Creates a thread which calls routine to measure process' CPU load:
@PostConstruct
public void startMeasuring() {
    new Thread() {
        @Override
        public void run() {
            // keep sampling until the thread is interrupted
            while (!Thread.currentThread().isInterrupted()) {
                try {
                    gaugeService.submit("process.cpu.load", getProcessCpuLoad());
                    Thread.sleep(2000); // measure every 2 sec
                } catch (Exception e) {
                    return;
                }
            }
        }
    }.start();
}
Has a routine which gets the CPU load for your process using MXBeans:
public static double getProcessCpuLoad() throws Exception {
    MBeanServer mbs = ManagementFactory.getPlatformMBeanServer();
    ObjectName name = ObjectName.getInstance("java.lang:type=OperatingSystem");
    AttributeList list = mbs.getAttributes(name, new String[]{ "ProcessCpuLoad" });
    if (list.isEmpty()) return Double.NaN;
    Attribute att = (Attribute) list.get(0);
    Double value = (Double) att.getValue();
    // usually takes a couple of seconds before we get real values
    if (value == -1.0) return Double.NaN;
    // returns a percentage value with 1 decimal point precision
    return ((int) (value * 1000) / 10.0);
}
You could also extract the system-wide CPU load using this method, as sketched below.
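For example, a companion to getProcessCpuLoad() above (my own sketch, not part of the original answer) could read the "SystemCpuLoad" attribute of the same MBean to get the whole machine's load:
public static double getSystemCpuLoad() throws Exception {
    MBeanServer mbs = ManagementFactory.getPlatformMBeanServer();
    ObjectName name = ObjectName.getInstance("java.lang:type=OperatingSystem");
    // "SystemCpuLoad" is the host-wide load, "ProcessCpuLoad" is this JVM's load
    AttributeList list = mbs.getAttributes(name, new String[]{ "SystemCpuLoad" });
    if (list.isEmpty()) return Double.NaN;
    Double value = (Double) ((Attribute) list.get(0)).getValue();
    if (value == -1.0) return Double.NaN; // not available during the first few seconds
    return ((int) (value * 1000) / 10.0); // percentage with 1 decimal point precision
}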
Hope this helps.

MyBatis Operation Gets Blocked in Spring Boot Async Method

In my project based on Spring Boot 1.3.3, I integrated MyBatis with mybatis-spring-boot-starter 1.1.1 as the persistence layer. All CRUD operations seem to work fine separately, but the integration tests fail, and I found that the DB operation gets blocked in the asynchronous task.
The test code looks like this:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = SapiApplication.class)
@Transactional
public class OrderIntegrationTest {

    @Test
    public void shouldUpdateOrder() throws InterruptedException {
        Order order1 = getOrder1();
        orderService.createOrder(order1);
        Order order1updated = getOrder1Updated();
        orderService.updateOrderAsync(order1updated);
        Thread.sleep(1000L);
        log.info("find the order!");
        Order order1Db = orderService.findOrderById(order1.getOrderId());
        log.info("found the order!");
        assertEquals("closed", order1Db.getStatus());
    }
}
The expected execution order is createOrder() -> updateOrderAsync() -> findOrderById(), but actually the execution order is createOrder() -> updateOrderAsync() started and blocked -> findOrderById() -> updateOrderAsync() continued and ended.
Log:
16:23:04.261 [executor1-1] INFO c.s.api.web.service.OrderServiceImpl - updating order: 2884384
16:23:05.255 [main] INFO c.s.a.w.service.OrderIntegrationTest - find the order!
16:23:05.280 [main] INFO c.s.a.w.service.OrderIntegrationTest - found the order!
16:23:05.299 [executor1-1] INFO c.s.api.web.service.OrderServiceImpl - updated order: 2884384
Other related code:
@Service
public class OrderServiceImpl implements OrderService {

    @Autowired
    private OrderDao orderDao;

    @Async("executor1")
    @Override
    public void updateOrderAsync(Order order) {
        log.info("updating order: {}", order.getOrderId());
        orderDao.updateOrder(order);
        log.info("updated order: {}", order.getOrderId());
    }
}
The DAO:
public interface OrderDao {
    public int updateOrder(Order order);
    public int createOrder(Order order);
    public Order findOrderById(String orderId);
}
The Gradle dependencies:
dependencies {
    compile 'org.springframework.boot:spring-boot-starter-jdbc'
    compile 'org.springframework.boot:spring-boot-starter-security'
    compile 'org.springframework.boot:spring-boot-starter-web'
    compile 'org.springframework.boot:spring-boot-starter-actuator'
    compile 'org.mybatis.spring.boot:mybatis-spring-boot-starter:1.1.1'
    compile 'ch.qos.logback:logback-classic:1.1.2'
    compile 'org.springframework.boot:spring-boot-configuration-processor'
    runtime 'mysql:mysql-connector-java'
    providedRuntime 'org.springframework.boot:spring-boot-starter-tomcat'
    testCompile 'org.springframework.boot:spring-boot-starter-test'
    testCompile "org.springframework.security:spring-security-test"
}
The Spring configuration:
@SpringBootApplication
@EnableAsync
@EnableCaching
@EnableScheduling
@MapperScan("com.sapi.web.dao")
public class SapiApplication {

    @Bean(name = "executor1")
    protected Executor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(5);
        executor.setMaxPoolSize(100);
        return executor;
    }

    @Bean
    @Primary
    @ConfigurationProperties(prefix = "datasource.primary")
    public DataSource numberMasterDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "secondary")
    @ConfigurationProperties(prefix = "datasource.secondary")
    public DataSource provisioningDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "jdbcTpl")
    public JdbcTemplate jdbcTemplate(@Qualifier("secondary") DataSource dsItems) {
        return new JdbcTemplate(dsItems);
    }

    public static void main(String[] args) {
        SpringApplication.run(SapiApplication.class, args);
    }
}
The properties:
mybatis.mapper-locations=classpath*:com/sapi/web/dao/*Mapper.xml
mybatis.type-aliases-package=com.sapi.web.vo
datasource.primary.driver-class-name=com.mysql.jdbc.Driver
datasource.primary.url=jdbc:mysql://10.0.6.202:3306/sapi
datasource.primary.username=xxx
datasource.primary.password=xxx
datasource.primary.maximum-pool-size=80
datasource.primary.max-idle=10
datasource.primary.max-active=150
datasource.primary.max-wait=10000
datasource.primary.min-idle=5
datasource.primary.initial-size=5
datasource.primary.validation-query=SELECT 1
datasource.primary.test-on-borrow=false
datasource.primary.test-while-idle=true
datasource.primary.time-between-eviction-runs-millis=18800
datasource.primary.jdbc-interceptors=ConnectionState;SlowQueryReport(threshold=100)
datasource.secondary.url = jdbc:mysql://10.0.6.202:3306/xdb
datasource.secondary.username = xxx
datasource.secondary.password = xxx
datasource.secondary.driver-class-name = com.mysql.jdbc.Driver
logging.level.org.springframework.web=DEBUG
The problem you see is caused by the fact that the whole test method shouldUpdateOrder is executed in one transaction. This means that any update operation executed in the thread that runs shouldUpdateOrder locks the record for the whole duration of the transaction (that is, until the test method exits), and that record cannot be updated by another concurrent transaction (the one executed in the async method).
To solve the issue you need to change the transaction boundaries. In your case the correct way to emulate real-life usage is to (as sketched below):
1) create the order in one transaction and finish that transaction
2) update the order in another transaction
3) check that the update was executed as expected in yet another transaction
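A minimal sketch of the restructured test (an illustration only; it reuses orderService, getOrder1() and getOrder1Updated() from the question and simply drops the class-level @Transactional so that each service call runs and commits in its own transaction):
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = SapiApplication.class)
public class OrderIntegrationTest { // no class-level @Transactional

    @Autowired
    private OrderService orderService;

    @Test
    public void shouldUpdateOrder() throws InterruptedException {
        // 1) create the order; this commits before the async update starts
        Order order1 = getOrder1();
        orderService.createOrder(order1);

        // 2) update the order in another (async) transaction
        orderService.updateOrderAsync(getOrder1Updated());
        Thread.sleep(1000L); // crude wait for the async update to commit

        // 3) verify in yet another transaction
        Order order1Db = orderService.findOrderById(order1.getOrderId());
        assertEquals("closed", order1Db.getStatus());

        // without a test-managed transaction nothing is rolled back automatically,
        // so clean up the created order here if the test data must not persist
    }
}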
