OpenJDK JMH - What does it mean to use @Param in a measurement?

I am new to JMH and I am trying to understand how @Param values are applied during a benchmark. Here is the code I am using:
import java.util.HashMap;
import java.util.Map;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Level;
import org.openjdk.jmh.annotations.Param;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;

public class BenchmarkMapper
{
    @State(Scope.Benchmark)
    public static class ExecutionPlan
    {
        public Source source;
        public Mapper mapper;

        @Param({ "100", "1000", "10000", "100000", "1000000" })
        public int iterations;

        @Setup(Level.Invocation)
        public void setUp()
        {
            this.source = BenchmarkUtils.createSource();
            this.mapper = new Mapper();
        }
    }

    @Benchmark
    public Map<Integer, Object> testMap(ExecutionPlan plan)
    {
        Map<Integer, Object> resultMap = new HashMap<>();
        for (int index = plan.iterations; index > 0; index--)
        {
            resultMap.put(index, plan.mapper.map(plan.source));
        }
        return resultMap;
    }
}
After running it I saw those parameter iterations being applied in the output.
So, what is affected in the benchmark when I use these @Param values? What is the difference between using the @Param and @Measurement annotations?
Thanks in advance
James

As stated in the JMH examples, the @Param annotation lets you try different configurations in your benchmark: the whole benchmark is run once for each parameter value (or for each combination, if you have several @Param fields). The @Measurement annotation, on the other hand, sets measurement-related settings for your benchmark; for example, you can set the number of measurement iterations in each fork, or the duration and time unit of each iteration. The total number of measured iterations for your benchmark is therefore forks * measurementIterations * parameterCombinations. I hope it helps!
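For illustration, here is a minimal sketch of how the two annotations combine (the class, method, and numbers below are made up for this example, not taken from the question):

import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Fork;
import org.openjdk.jmh.annotations.Measurement;
import org.openjdk.jmh.annotations.OutputTimeUnit;
import org.openjdk.jmh.annotations.Param;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;

@State(Scope.Benchmark)
public class ParamVsMeasurement
{
    // @Param: the whole benchmark is re-run once per value, so 3 configurations
    @Param({ "10", "100", "1000" })
    public int size;

    // @Measurement/@Fork: how each configuration is measured;
    // 2 forks x 5 measurement iterations x 3 params = 30 measured iterations in total
    @Fork(2)
    @Measurement(iterations = 5, time = 1, timeUnit = TimeUnit.SECONDS)
    @OutputTimeUnit(TimeUnit.MILLISECONDS)
    @Benchmark
    public long sumUpTo()
    {
        long sum = 0;
        for (int i = 0; i < size; i++)
        {
            sum += i;
        }
        return sum;
    }
}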

Related

spring-kafka kafkaStreamsBuilder.getKafkaStreams() is null

Here is my code
The first bean watches the messages on Topic.TRANSACTION_RAW, splits each message into two, and sends them to Topic.TRANSACTION_INTERNAL.
The second bean groups and reduces the stream and materializes it to the state store StateStore.BALANCE.
The last one is supposed to return a ReadOnlyKeyValueStore for reading state from StateStore.BALANCE.
@Configuration(proxyBeanMethods = false)
@EnableKafkaStreams
public class MyKafkaStreamsConfiguration {

    @Bean
    public KStream<String, BankTransaction> alphaBankKStream(StreamsBuilder streamsBuilder) {
        JsonSerde<BankTransaction> valueSerde = new JsonSerde<>(BankTransaction.class);
        KStream<String, BankTransaction> stream = streamsBuilder.stream(Topic.TRANSACTION_RAW,
                Consumed.with(Serdes.String(), valueSerde));
        stream.flatMap((k, v) -> {
            List<BankTransactionInternal> txInternals = BankTransactionInternal.splitBankTransaction(v);
            List<KeyValue<String, BankTransactionInternal>> result = new LinkedList<>();
            result.add(KeyValue.pair(v.getFromAccount(), txInternals.get(0)));
            result.add(KeyValue.pair(v.getToAccount(), txInternals.get(1)));
            return result;
        }).filter((k, v) -> !Constants.EXTERNAL_ACCOUNT.equalsIgnoreCase(k))
          .to(Topic.TRANSACTION_INTERNAL, Produced.with(Serdes.String(), new JsonSerde<>()));
        return stream;
    }

    @Bean
    public KStream<String, BankTransactionInternal> alphaBankInternalKStream(StreamsBuilder streamsBuilder) {
        JsonSerde<BankTransactionInternal> valueSerde = new JsonSerde<>(BankTransactionInternal.class);
        KStream<String, BankTransactionInternal> stream = streamsBuilder.stream(Topic.TRANSACTION_INTERNAL,
                Consumed.with(Serdes.String(), valueSerde));
        KGroupedStream<String, Double> groupedByAccount = stream
                .map((k, v) -> KeyValue.pair(k, v.getAmount()))
                .groupBy((account, amount) -> account, Grouped.with(Serdes.String(), Serdes.Double()));
        groupedByAccount.reduce(Double::sum,
                Materialized.<String, Double, KeyValueStore<Bytes, byte[]>>as(StateStore.BALANCE)
                        .withValueSerde(Serdes.Double()));
        return stream;
    }

    @Bean
    public ReadOnlyKeyValueStore<String, Double> balanceStateStore(StreamsBuilderFactoryBean defaultKafkaStreamsBuilder) {
        if (defaultKafkaStreamsBuilder == null) {
            System.out.println("... defaultKafkaStreamsBuilder is null ...");
        }
        if (defaultKafkaStreamsBuilder.getKafkaStreams() == null) {
            System.out.println("... defaultKafkaStreamsBuilder.getKafkaStreams() is null ...");
            // this one got printed
        }
        ReadOnlyKeyValueStore<String, Double> store = defaultKafkaStreamsBuilder.getKafkaStreams().store(
                StateStore.BALANCE,
                QueryableStoreTypes.keyValueStore());
        return store;
    }
}
I always get a NullPointerException on defaultKafkaStreamsBuilder.getKafkaStreams().
Any idea what is wrong here? Thanks!
if (defaultKafkaStreamsBuilder.getKafkaStreams() == null) {
    System.out.println("... defaultKafkaStreamsBuilder.getKafkaStreams() is null ...");
    // this one got printed
}
This is not a good operation to perform during the bean definition phase.
See its JavaDocs:
/**
 * Get a managed by this {@link StreamsBuilderFactoryBean} {@link KafkaStreams} instance.
 * @return KafkaStreams managed instance;
 * may be null if this {@link StreamsBuilderFactoryBean} hasn't been started.
 * @since 1.1.4
 */
public synchronized KafkaStreams getKafkaStreams() {
Since you call this method far too early, before the lifecycle start phase, you end up with that error.
You should reconsider your logic in favor of SmartLifecycle.start() in the target service where you'd like to use that ReadOnlyKeyValueStore: autowire the StreamsBuilderFactoryBean over there and call its getKafkaStreams() from the start() implementation.
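A minimal sketch of that approach (the service class and its wiring are assumptions for illustration; StateStore.BALANCE comes from the question):

import org.apache.kafka.streams.state.QueryableStoreTypes;
import org.apache.kafka.streams.state.ReadOnlyKeyValueStore;
import org.springframework.context.SmartLifecycle;
import org.springframework.kafka.config.StreamsBuilderFactoryBean;
import org.springframework.stereotype.Component;

@Component
public class BalanceStoreService implements SmartLifecycle {

    private final StreamsBuilderFactoryBean factoryBean;
    private volatile ReadOnlyKeyValueStore<String, Double> store;
    private volatile boolean running;

    public BalanceStoreService(StreamsBuilderFactoryBean factoryBean) {
        this.factoryBean = factoryBean;
    }

    @Override
    public void start() {
        // The default SmartLifecycle phase (Integer.MAX_VALUE) starts after the
        // factory bean, so KafkaStreams has been created and started by now.
        store = factoryBean.getKafkaStreams().store(
                StateStore.BALANCE,
                QueryableStoreTypes.keyValueStore());
        running = true;
    }

    @Override
    public void stop() {
        running = false;
    }

    @Override
    public boolean isRunning() {
        return running;
    }

    public Double balanceFor(String account) {
        return store.get(account);
    }
}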

How do I get real-time statistics in the MOEA Framework?

I know there is the Instrumenter class; however, it outputs the data only after the run finishes. I would like to get (near) real-time data, like in the Symbolic Regression demo.
Looking at its code, it seems I need to use the step method and imitate runSingleSeed in Executor. Is there a better way, perhaps some other class like Instrumenter but asynchronous? I cannot really find anything similar online.
Just build a wrapper around the optimization loop (similar to the one below) and make it a subject in an observer pattern.
import java.text.DecimalFormat;
import java.util.Properties;

import org.moeaframework.core.Algorithm;
import org.moeaframework.core.NondominatedPopulation;
import org.moeaframework.core.Population;
import org.moeaframework.core.Problem;
import org.moeaframework.core.Solution;
import org.moeaframework.core.spi.AlgorithmFactory;
import org.moeaframework.core.variable.EncodingUtils;
import org.moeaframework.problem.misc.Kursawe;

public class Main {
    public static void main(String[] args) {
        String algorithmName = "NSGAII";
        Properties properties = new Properties();
        properties.setProperty("populationSize", "100"); // to change properties

        Problem problem = new Kursawe();
        Algorithm algorithm = AlgorithmFactory.getInstance()
                .getAlgorithm(algorithmName, properties, problem);

        int maxGenerations = 100;
        int generation = 0;
        while (generation < maxGenerations) {
            if (generation % 10 == 1) {
                System.out.println("Generation " + generation);
                NondominatedPopulation paretoFront = algorithm.getResult();
                // metrics
                System.out.print("One solution of the Pareto front: ");
                System.out.println(toString(paretoFront.get(0)));
            }
            algorithm.step();
            generation++;
        }
        algorithm.terminate();

        System.out.println("Pareto Front:");
        for (Solution solution : algorithm.getResult()) {
            System.out.println(toString(solution));
        }
        export(algorithm.getResult());
    }

    private static String toString(Solution solution) {
        StringBuilder out = new StringBuilder();
        double[] variables = EncodingUtils.getReal(solution);
        double[] objectives = solution.getObjectives();
        out.append("f");
        out.append(doubleArrayToString(variables));
        out.append(" = ");
        out.append(doubleArrayToString(objectives));
        return out.toString();
    }

    private static String doubleArrayToString(double[] array) {
        DecimalFormat format = new DecimalFormat("+#,##0.00;-#");
        StringBuilder out = new StringBuilder();
        out.append("[");
        for (int i = 0; i < array.length - 1; i++) {
            out.append(format.format(array[i]));
            out.append(", ");
        }
        out.append(format.format(array[array.length - 1]));
        out.append("]");
        return out.toString();
    }

    private static void export(Population population) {
        System.out.println();
        for (Solution solution : population) {
            double[] objectives = solution.getObjectives();
            System.out.println(String.format("%.3f,%.3f", objectives[0], objectives[1]));
        }
    }
}
Another option, besides the one indicated by Black Arrow, if you are using multiple threads, is to extend AlgorithmFactory. For example:
public class MyAlgorithmFactory extends AlgorithmFactory {

    private Algorithm algorithm;

    public Algorithm getGeneratedAlgorithm() {
        return this.algorithm;
    }

    @Override
    public Algorithm getAlgorithm(String name, Properties properties, Problem problem) {
        this.algorithm = super.getAlgorithm(name, properties, problem);
        return algorithm;
    }
}
Then you use this factory with your Executor, for example:
MyAlgorithmFactory af = new MyAlgorithmFactory();
Executor executor = new Executor()
        .usingAlgorithmFactory(af)
        .withAlgorithm("NSGAII")
        .withProblem(yourProblemHere)
        .withMaxEvaluations(10000);
After this you can start the Executor on a separate thread and call af.getGeneratedAlgorithm() to get the Algorithm instance initialized by the Executor. From this Algorithm you can get the current NondominatedPopulation while the Executor is still running and compute statistics from it.
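For example, a minimal sketch of that setup (the problem and polling interval are placeholders; note that reading the algorithm's result while another thread is stepping it is not strictly synchronized, so treat intermediate values as approximate):

import org.moeaframework.Executor;
import org.moeaframework.core.Algorithm;
import org.moeaframework.core.NondominatedPopulation;
import org.moeaframework.problem.misc.Kursawe;

public class ProgressPoller {
    public static void main(String[] args) throws InterruptedException {
        MyAlgorithmFactory af = new MyAlgorithmFactory();
        Executor executor = new Executor()
                .usingAlgorithmFactory(af)
                .withAlgorithm("NSGAII")
                .withProblem(new Kursawe()) // placeholder problem
                .withMaxEvaluations(10000);

        // Run the optimization in a background thread.
        Thread runner = new Thread(executor::run);
        runner.start();

        // Poll the live Algorithm instance while the run is in progress.
        while (runner.isAlive()) {
            Algorithm algorithm = af.getGeneratedAlgorithm();
            if (algorithm != null) {
                NondominatedPopulation current = algorithm.getResult();
                System.out.println("Current front size: " + current.size());
            }
            Thread.sleep(500); // poll twice a second; tune as needed
        }
        runner.join();
    }
}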

Setting up TableColumn values using generic types

I wanted to program a table browser for a MySQL database in JavaFX.
My first problem is that I don't know which types I will get back from the database, so I decided to wrap those types in a wrapper class.
To show these values in the GUI, I used TableColumn's setCellValueFactory method, which needs a value that implements ObservableValue.
So I tried to implement the ObservableValue interface.
But when I run the program it doesn't show the right values.
(Screenshot: TableBrowser after connecting to the database.)
Does anyone have an idea where I went wrong, or know a more recommended way to implement this?
Here is the relevant part of the code from the TableBrowser:
/*
 * This variable is used to iterate over the tableview's columns.
 * It is a class variable because a local variable cannot be used
 * from within a lambda expression unless it is effectively final.
 */
int t = 0;

// these two variables are defined in the class body
private final TableView<Entry> tableview = new TableView<>();
private final ObservableList<Entry> columndata = FXCollections.observableArrayList();

// the following code is inside the button's ActionListener

// adds a new TableColumn for every column in the DB
for (int i = 1; i <= maxcol; i++)
{
    tableview.getColumns().add(new TableColumn<Entry, String>(rsmd.getColumnName(i)));
}

// iterates over the ResultSet
while (rs.next())
{
    // this is the dataset I put in my TableView
    Entry row = new Entry(maxcol);
    // for each column I add the column value to the current dataset
    for (int i = 1; i <= maxcol; i++)
    {
        int type = rsmd.getColumnType(i);
        Object value = rs.getObject(i);
        row.setCellValue(i - 1, type, value);
    }
    // adds a new dataset to the ObservableList<Entry>
    columndata.add(row);
}

// puts all datasets in the TableView
tableview.setItems(columndata);

// iterates over all columns
for (t = 0; t < tableview.getColumns().size(); t++)
{
    // should set the CellValueFactory for each column so it shows the data
    /*
     * I apologise if there is a horrible mistake.
     * I never worked with lambdas before and just copied this from an example page :)
     */
    tableview.getColumns().get(t).setCellValueFactory(celldata -> celldata.getValue().getCellValue(t - 1));
}
This is my Entry class, which is an inner class of the TableBrowser class:
/*
 * Represents a dataset.
 * Has an array which holds every column value as a WrapperType.
 */
private class Entry
{
    WrapperType<?>[] columns;

    private Entry(int columncount)
    {
        columns = new WrapperType[columncount];
    }

    private WrapperType<?> getCellValue(int col)
    {
        return columns[col];
    }

    private void setCellValue(int col, int type, Object value)
    {
        columns[col] = MySQLTypeWrapper.getInstance().wrapType(type, value);
    }
}
Here is the MySQLTypeWrapper class, which holds WrapperType as an inner class:
public class MySQLTypeWrapper
{
    public WrapperType<?> wrapType(int type, Object value)
    {
        Class<?> typeclass = toClass(type);
        return new WrapperType<>(typeclass.cast(value));
    }

    /*
     * Returns the appropriate class for every database type,
     * e.g. VARCHAR returns String.class
     */
    private static Class<?> toClass(int type) {...}

    /*
     * I copied the content of the overridden methods from StringPropertyBase,
     * as I have no clue how to implement ObservableValue.
     */
    class WrapperType<T> implements ObservableValue<WrapperType<T>>
    {
        private T value;
        private ExpressionHelper<WrapperType<T>> helper = null;

        private WrapperType(T value)
        {
            this.value = value;
        }

        @Override
        public void addListener(InvalidationListener listener)
        {
            helper = ExpressionHelper.addListener(helper, this, listener);
        }

        @Override
        public void removeListener(InvalidationListener listener)
        {
            helper = ExpressionHelper.removeListener(helper, listener);
        }

        @Override
        public void addListener(ChangeListener<? super WrapperType<T>> listener)
        {
            helper = ExpressionHelper.addListener(helper, this, listener);
        }

        @Override
        public void removeListener(ChangeListener<? super WrapperType<T>> listener)
        {
            helper = ExpressionHelper.removeListener(helper, listener);
        }

        @Override
        public WrapperType<T> getValue()
        {
            return this;
        }

        public String toString()
        {
            return value.toString();
        }
    }
}
Thanks for your help in advance :)
As mentioned in the comments, your first problem was not using the TableView's Items property.
For the second part - one solution would be to create a helper method along the lines of
private <T> Callback<TableColumn.CellDataFeatures<Entry, T>, ObservableValue<T>> createCellFactory(int columnIndex) {
    return celldata -> celldata.getValue().getCellValue(columnIndex);
}
and then change the loop to
// Now t can be a local variable, as it is not directly passed to the lambda.
for (int t = 0; t < tableview.getColumns().size(); t++)
{
    // sets the CellValueFactory for each column so it shows the data
    tableview.getColumns().get(t).setCellValueFactory(createCellFactory(t));
}
Note that this time the variable passed to the lambda is a local, effectively final variable rather than an instance variable, so the lambda is created with the correct value every time.
One last word of advice: are you sure you need this amount of generality? It is usually better to create a class that directly represents your DB structure, with proper getters and setters; then you can use PropertyValueFactory.
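For illustration, a minimal sketch of that typed-row approach (the Person class and its column are invented for this example):

import javafx.beans.property.SimpleStringProperty;
import javafx.beans.property.StringProperty;
import javafx.collections.FXCollections;
import javafx.scene.control.TableColumn;
import javafx.scene.control.TableView;
import javafx.scene.control.cell.PropertyValueFactory;

public class PersonTable {

    // One class per table, one property per column.
    public static class Person {
        private final StringProperty name = new SimpleStringProperty();

        public Person(String name) { this.name.set(name); }

        // PropertyValueFactory("name") looks up nameProperty() (or getName()) by reflection.
        public StringProperty nameProperty() { return name; }
        public String getName() { return name.get(); }
    }

    public static TableView<Person> build() {
        TableView<Person> table = new TableView<>(
                FXCollections.observableArrayList(new Person("Alice"), new Person("Bob")));
        TableColumn<Person, String> nameColumn = new TableColumn<>("Name");
        nameColumn.setCellValueFactory(new PropertyValueFactory<>("name"));
        table.getColumns().add(nameColumn);
        return table;
    }
}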

Constraints on parameters in an API interface

I've declared an API call in an interface and was wondering whether it is possible to put constraints on some of the parameters. The API I'm accessing enforces these constraints as well, and I would like to enforce them in my program.
@GET("/recipes/search")
Call<RecipeResponse> getRecipes(
        @Query("cuisine") String cuisine,
        @Query("diet") String diet,
        @Query("excludeIngredients") String excludeIngredients,
        @Query("intolerances") String intolerances,
        @Query("number") Integer number,
        @Query("offset") Integer offset,
        @Query("query") String query,
        @Query("type") String type
);
How can I do this?
I know that it is possible to do this with a POST request by passing along an object as the request body through the @Body annotation. Can I do this with a GET request too, where the information is passed via the query string?
Thanks!
I think I ended up finding a solution. I made a class SearchRecipeRequest in which I declare all possible parameters as fields. In the setters I do the data validation, such as checking required parameters for null, or the min/max constraints on integers specified by the endpoint. I then made a SearchRecipeRequestBuilder class to build such an object, to make it easier to deal with all those possible parameters:
public class SearchRecipeRequestBuilder {

    private String _cuisine = null,
            _diet = null,
            _excludeIngredients = null,
            _intolerances = null,
            _query = null,
            _type = null;
    private Integer _number = null,
            _offset = null;

    public SearchRecipeRequestBuilder() {}

    public SearchRecipeRequest buildRequest() {
        return new SearchRecipeRequest(_cuisine, _diet, _excludeIngredients, _intolerances, _number, _offset, _query, _type);
    }

    public SearchRecipeRequestBuilder cuisine(String cuisine) {
        _cuisine = cuisine;
        return this;
    }

    public SearchRecipeRequestBuilder diet(String diet) {
        _diet = diet;
        return this;
    }

    public SearchRecipeRequestBuilder excludeIngredients(String excludeIngredients) {
        _excludeIngredients = excludeIngredients;
        return this;
    }

    public SearchRecipeRequestBuilder intolerances(String intolerances) {
        _intolerances = intolerances;
        return this;
    }

    public SearchRecipeRequestBuilder query(String query) {
        _query = query;
        return this;
    }

    public SearchRecipeRequestBuilder type(String type) {
        _type = type;
        return this;
    }

    public SearchRecipeRequestBuilder number(Integer number) {
        _number = number;
        return this;
    }

    public SearchRecipeRequestBuilder offset(Integer offset) {
        _offset = offset;
        return this;
    }
}
Which allows me to build the request like so:
SearchRecipeRequest request = new SearchRecipeRequestBuilder()
        .query("burger")
        .buildRequest();
I then pass along that object to a different function that knows how to use the request object to pass it along to the API.
That's how I'm doing it right now, if someone has a better way I'd love to hear it. :)
I got the idea to use the Builder pattern from a different StackOverflow question: Managing constructors with many parameters in Java.
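For illustration, the validation described above might look something like this inside SearchRecipeRequest (a sketch; the 1-100 range is a made-up constraint, not the real endpoint's limit):

public class SearchRecipeRequest {

    private String query;
    private Integer number;

    // ... remaining fields, constructor, and getters ...

    public void setQuery(String query) {
        // required parameter: reject null/empty
        if (query == null || query.isEmpty()) {
            throw new IllegalArgumentException("query is required");
        }
        this.query = query;
    }

    public void setNumber(Integer number) {
        // hypothetical range; substitute the endpoint's documented limits
        if (number != null && (number < 1 || number > 100)) {
            throw new IllegalArgumentException("number must be between 1 and 100");
        }
        this.number = number;
    }
}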

Size-limited queue that holds last N elements in Java

A very simple & quick question on Java libraries: is there a ready-made class that implements a queue with a fixed maximum size, i.e. one that always allows addition of elements but silently removes head elements to accommodate space for newly added elements?
Of course, it's trivial to implement it manually:
import java.util.LinkedList;

public class LimitedQueue<E> extends LinkedList<E> {
    private int limit;

    public LimitedQueue(int limit) {
        this.limit = limit;
    }

    @Override
    public boolean add(E o) {
        super.add(o);
        while (size() > limit) { super.remove(); }
        return true;
    }
}
As far as I can see, there's no standard implementation in the Java standard libraries, but maybe there's one in Apache Commons or something like that?
Apache Commons Collections 4 has a CircularFifoQueue<E>, which is what you are looking for. Quoting the javadoc:
CircularFifoQueue is a first-in first-out queue with a fixed size that replaces its oldest element if full.
import java.util.Queue;
import org.apache.commons.collections4.queue.CircularFifoQueue;
Queue<Integer> fifo = new CircularFifoQueue<Integer>(2);
fifo.add(1);
fifo.add(2);
fifo.add(3);
System.out.println(fifo);
// Observe the result:
// [2, 3]
If you are using an older version of Apache Commons Collections (3.x), you can use the CircularFifoBuffer, which is basically the same thing without generics.
Update: updated answer following the release of Commons Collections version 4, which supports generics.
Guava now has an EvictingQueue, a non-blocking queue which automatically evicts elements from the head of the queue when attempting to add new elements to a full queue.
import java.util.Queue;
import com.google.common.collect.EvictingQueue;
Queue<Integer> fifo = EvictingQueue.create(2);
fifo.add(1);
fifo.add(2);
fifo.add(3);
System.out.println(fifo);
// Observe the result:
// [2, 3]
I like @FractalizeR's solution, but I would in addition keep and return the value from super.add(o):
public class LimitedQueue<E> extends LinkedList<E> {
    private int limit;

    public LimitedQueue(int limit) {
        this.limit = limit;
    }

    @Override
    public boolean add(E o) {
        boolean added = super.add(o);
        while (added && size() > limit) {
            super.remove();
        }
        return added;
    }
}
Use composition, not extension (yes, I mean extension, as in using the extends keyword in Java; and yes, this is inheritance). Composition is superior because it completely shields your implementation, allowing you to change the implementation without impacting the users of your class.
I recommend trying something like this (I'm typing directly into this window, so buyer beware of syntax errors):
public class LimitedSizeQueue<ElementType> implements Queue<ElementType>
{
    private int maxSize;
    private LinkedList<ElementType> storageArea;

    public LimitedSizeQueue(final int maxSize)
    {
        this.maxSize = maxSize;
        storageArea = new LinkedList<ElementType>();
    }

    public boolean offer(ElementType element)
    {
        if (storageArea.size() >= maxSize)
        {
            storageArea.removeLast(); // drop the oldest element to make room
        }
        storageArea.addFirst(element);
        return true;
    }

    // ... the rest of this class
}
A better option (based on the answer by Asaf) might be to wrap the Apache Commons Collections CircularFifoBuffer in a generic class. For example:
public class LimitedSizeQueue<ElementType> implements Queue<ElementType>
{
    private int maxSize;
    private CircularFifoBuffer storageArea;

    public LimitedSizeQueue(final int maxSize)
    {
        if (maxSize > 0)
        {
            this.maxSize = maxSize;
            storageArea = new CircularFifoBuffer(maxSize);
        }
        else
        {
            throw new IllegalArgumentException("blah blah blah");
        }
    }

    // ... implement the Queue interface using the CircularFifoBuffer class
}
The only thing I know of that has limited space is the BlockingQueue interface (implemented e.g. by the ArrayBlockingQueue class), but such queues do not remove the first element when full; instead they block the put operation until space is free (i.e. until an element is removed by another thread).
To my knowledge, your trivial implementation is the easiest way to get this behaviour.
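For contrast, a small sketch of that blocking/rejecting behaviour:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(2);
        queue.put(1);
        queue.put(2);
        // offer() returns false instead of evicting the head element...
        System.out.println(queue.offer(3)); // prints: false
        // ...and put(3) would block here until another thread take()s an element.
    }
}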
You can use a MinMaxPriorityQueue from Google Guava; from the javadoc:
A min-max priority queue can be configured with a maximum size. If so, each time the size of the queue exceeds that value, the queue automatically removes its greatest element according to its comparator (which might be the element that was just added). This is different from conventional bounded queues, which either block or reject new elements when full.
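A usage sketch; note that eviction follows the comparator (the greatest element is dropped), not insertion order:

import com.google.common.collect.MinMaxPriorityQueue;

MinMaxPriorityQueue<Integer> queue = MinMaxPriorityQueue
        .maximumSize(2)
        .create();
queue.add(1);
queue.add(2);
queue.add(3); // 3 is the greatest element, so it is evicted immediately
System.out.println(queue); // [1, 2] (iteration order may vary)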
An LRUMap is another possibility, also from Apache Commons.
http://commons.apache.org/collections/apidocs/org/apache/commons/collections/map/LRUMap.html
OK, I'll share this option. It is a pretty performant one: it uses an array internally and reuses entries. It's thread-safe, and you can retrieve the contents as a List.
static class FixedSizeCircularReference<T> {

    private final Object[] entries;
    private final int size;
    private int cur = 0;

    FixedSizeCircularReference(int size) {
        this.entries = new Object[size];
        this.size = size;
    }

    synchronized void add(T entry) {
        entries[cur++] = entry;
        if (cur >= size) {
            cur = 0;
        }
    }

    @SuppressWarnings("unchecked")
    synchronized List<T> asList() {
        List<T> list = new ArrayList<>();
        // cur points at the slot that will be overwritten next, i.e. the oldest entry
        for (int i = 0; i < size; i++) {
            Object entry = entries[(cur + i) % size];
            if (entry != null) {
                list.add((T) entry);
            }
        }
        return list;
    }
}
import java.util.ArrayDeque;

public class ArrayLimitedQueue<E> extends ArrayDeque<E> {

    private int limit;

    public ArrayLimitedQueue(int limit) {
        super(limit + 1);
        this.limit = limit;
    }

    @Override
    public boolean add(E o) {
        boolean added = super.add(o);
        while (added && size() > limit) {
            super.remove();
        }
        return added;
    }

    @Override
    public void addLast(E e) {
        super.addLast(e);
        while (size() > limit) {
            super.removeFirst(); // evict the oldest element, not the one just added
        }
    }

    @Override
    public boolean offerLast(E e) {
        boolean added = super.offerLast(e);
        while (added && size() > limit) {
            super.pollFirst(); // evict the oldest element, not the one just added
        }
        return added;
    }
}
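A quick usage sketch of this ArrayDeque-based variant:

Queue<Integer> q = new ArrayLimitedQueue<>(3);
for (int i = 1; i <= 5; i++) {
    q.add(i);
}
System.out.println(q); // [3, 4, 5] - only the last three elements are kept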
