How to read and filter entities/aggregates based on a condition in Axon and then change them

I am new to Axon and maybe I missed something, but I need help understanding this.
I have a simple food cart aggregate.
Here is an example:
@Aggregate
class FoodCard {

    @AggregateIdentifier
    private lateinit var foodCardId: UUID

    private lateinit var selectedProduct: MutableMap<UUID, Int>

    constructor()

    @CommandHandler
    constructor(command: CreateFoodCartCommand) {
        AggregateLifecycle.apply(FoodCartCreateEvent(
            UUID.randomUUID()
        ))
    }

    @CommandHandler
    fun handle(command: SelectProductCommand) {
        AggregateLifecycle
            .apply(ProductSelectedEvent(foodCardId, command.productId, command.quantity))
    }

    @CommandHandler
    fun handle(command: DeleteFoodCartCommand) {
        AggregateLifecycle
            .apply(FoodCartDeleteEvent(foodCardId))
    }

    @CommandHandler
    fun handle(command: DeselectProductCommand) {
        val productId = command.productId
        if (!selectedProduct.containsKey(productId)) {
            throw ProductDeselectionException("ProductDeselectionException")
        }
        AggregateLifecycle
            .apply(ProductDeselectEvent(foodCardId, productId, command.quantity))
    }

    @EventSourcingHandler
    fun on(event: FoodCartCreateEvent) {
        foodCardId = event.foodCardId
        selectedProduct = mutableMapOf()
    }

    @EventSourcingHandler
    fun on(event: ProductSelectedEvent) {
        selectedProduct.merge(
            event.productId,
            event.quantity
        ) { a, b -> a + b }
    }
}
As the event store I am using Axon Server.
For the FoodCard projection I am using a JPA repository that connects to the DB.
I want to get all food carts that contain a specific product (a concrete UUID) and decrease its quantity by 1 for all of them.
I understand there are two types of actions: read and write.
So the question is: how do I correctly implement this flow with Axon?
Thanks

From your explanation and code, I feel you will probably need to complete your implementation of DeselectProductCommand by introducing an @EventSourcingHandler for ProductDeselectEvent. If I understood correctly, your "quantity" information is stored in the selectedProduct map and, based on your code, the quantity that should be subtracted from the product is carried in the command.
You will also need a query, such as FindAllFoodCartByProductId, that retrieves the foodCardId aggregate identifiers of the carts containing a certain productId: this operation will be performed on your projection through the JPA repository.
As a reference, have a look at the reference guide (https://docs.axoniq.io/reference-guide/implementing-domain-logic/query-handling) on how to use the QueryGateway in your controller and implement a @QueryHandler in your projection.
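A minimal sketch of those pieces, assuming an injected CommandGateway/QueryGateway and a projection entity/repository (FoodCartView, FoodCartViewRepository) that are not shown in the question; the command's parameter order is also an assumption:

// 1) In the aggregate: apply the quantity change when ProductDeselectEvent is sourced.
@EventSourcingHandler
fun on(event: ProductDeselectEvent) {
    val remaining = (selectedProduct[event.productId] ?: 0) - event.quantity
    if (remaining > 0) selectedProduct[event.productId] = remaining else selectedProduct.remove(event.productId)
}

// 2) The query message and its handler in the projection (hypothetical view/repository names).
data class FindAllFoodCartByProductId(val productId: UUID)

@QueryHandler
fun handle(query: FindAllFoodCartByProductId): List<UUID> =
    foodCartViewRepository.findAllByProductId(query.productId).map { it.foodCartId }

// 3) The flow: read the matching carts from the projection, then send a command per aggregate.
fun deselectEverywhere(productId: UUID) {
    queryGateway.query(
        FindAllFoodCartByProductId(productId),
        ResponseTypes.multipleInstancesOf(UUID::class.java)
    ).thenAccept { cartIds ->
        cartIds.forEach { commandGateway.send<Any>(DeselectProductCommand(it, productId, 1)) }
    }
}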
Corrado.

Related

Seeding many-to-many databases in EFCore5 with ModelBuilder?

There are many questions about seeding many-to-many relationships in Entity Framework. However, most of them are extremely old, and many-to-many behavior has changed significantly in EFCore5. The official docs recommend overriding OnModelCreating to implement ModelBuilder.Entity<>.HasData().
However, with the new many-to-many behavior (without explicit mappings), I can find no clear path to seed the intermediate tables. To use the example of this tutorial, the BookCategories class is now implicit. Therefore, there is no path to explicitly declare the intermediate table values while seeding.
I've also tried simply assigning the arrays, e.g.,:
public class Book
{
    public int BookId { get; set; }
    public string Title { get; set; }
    public ICollection<Category> Categories { get; set; }
}

public class Category
{
    public int CategoryId { get; set; }
    public string CategoryName { get; set; }
    public ICollection<Book> Books { get; set; }
}
And then at seed time:
Book book = new Book() { BookId = 1, Title = "Brave New World" };
Category category = new Category() { CategoryId = 1, CategoryName = "Dystopian" };
category.Books = new List<Book>() { book };
book.Categories = new List<Category>() { category };

modelBuilder.Entity<Book>().HasData(book);
modelBuilder.Entity<Category>().HasData(category);
... but there are no entries created for BookCategories in the resulting migration. This was somewhat expected, as this article suggests that one must explicitly seed the intermediate table. What I want is something like this:
modelBuilder.Entity<BookCategory>().HasData(
new BookCategory() { BookId = 1, CategoryId = 1 }
);
However, again, since there is no concrete class to describe BookCategories in EFCore5, the only way I can think of to seed the table is to manually edit the migration with additional MigrationBuilder.InsertData commands, which rather defeats the purpose of seeding data via application code.
However, again, since there is no concrete class to describe BookCategories in EFCore5
Actually, as explained in the What's new link, EF Core 5 allows you to have an explicit join entity
public class BookCategory
{
    public int BookId { get; set; }
    public Book Book { get; set; }
    public int CategoryId { get; set; }
    public Category Category { get; set; }
}
and configure the many-to-many relationship to use it
modelBuilder.Entity<Book>()
    .HasMany(left => left.Categories)
    .WithMany(right => right.Books)
    .UsingEntity<BookCategory>(
        right => right.HasOne(e => e.Category).WithMany(),
        left => left.HasOne(e => e.Book).WithMany().HasForeignKey(e => e.BookId),
        join => join.ToTable("BookCategories")
    );
This way you can use all normal entity operations (query, change tracking, data model seeding etc.) with it
modelBuilder.Entity<BookCategory>().HasData(
new BookCategory() { BookId = 1, CategoryId = 1 }
);
still having the new many-to-many skip navigations mapping.
This is probably the simplest as well as the type-safe approach.
In case you think it's too much, using the conventional join entity is also possible, but you need to know the shared dictionary entity type name as well as the two shadow property names. Which, as you will see, by convention might not be what you expect.
So, by convention the join entity (and table) name is
{LeftEntityName}{RightEntityName}
and the shadow property (and column) names are
{LeftEntityNavigationPropertyName}{RightEntityKeyName}
{RightEntityNavigationPropertyName}{LeftEntityKeyName}
The first question would be - which is the left/right entity? The answer is (not documented yet) - by convention the left entity is the one whose name comes first in alphabetical order. So with your example Book is left, Category is right, so the join entity and table name would be BookCategory.
It can be changed by adding an explicit
modelBuilder.Entity<Category>()
.HasMany(left => left.Books)
.WithMany(right => right.Categories);
and now it would be CategoryBook.
In both cases the shadow property (and column) names would be
CategoriesCategoryId
BooksBookId
So neither the table name, nor the property/column names are what you'd normally do.
And apart from the database table/column names, the entity and property names are important because you'd need them for entity operations, including the data seeding in question.
With that being said, even if you don't create an explicit join entity, it's better to fluently configure the one created automatically by EF Core convention:
modelBuilder.Entity<Book>()
    .HasMany(left => left.Categories)
    .WithMany(right => right.Books)
    .UsingEntity("BookCategory", typeof(Dictionary<string, object>),
        right => right.HasOne(typeof(Category)).WithMany().HasForeignKey("CategoryId"),
        left => left.HasOne(typeof(Book)).WithMany().HasForeignKey("BookId"),
        join => join.ToTable("BookCategories")
    );
Now you can use the entity name to access the EntityTypeBuilder
modelBuilder.Entity("BookCategory")
and you can seed it similarly to normal entities with shadow FK properties, using an anonymous type
modelBuilder.Entity("BookCategory").HasData(
    new { BookId = 1, CategoryId = 1 }
);
or, for this specific property bag entity type, also with Dictionary<string, object> instances
modelBuilder.Entity("BookCategory").HasData(
    new Dictionary<string, object> { ["BookId"] = 1, ["CategoryId"] = 1 }
);
Update:
People seem to misinterpret the aforementioned "extra" steps and find them redundant, "too much", and not needed.
I never said they are mandatory. If you know the conventional join entity and property names, go ahead directly to the last step and use an anonymous type or Dictionary<string, object>.
I already explained the drawbacks of taking that route - losing the C# type safety and using "magic" strings out of your control. You have to be smart enough to know the exact EF Core naming conventions and to realize that if you rename class Book to EBook, the new join entity/table name will change from "BookCategory" to "CategoryEBook", as well as the order of the PK properties/columns, associated indexes etc.
Regarding the concrete problem with data seeding: if you really want to generalize it (the OP's attempt in their own answer), at least do it correctly by using the EF Core metadata system rather than reflection and assumptions. For instance, the following will extract these names from the EF Core metadata:
public static void HasJoinData<TFirst, TSecond>(
    this ModelBuilder modelBuilder,
    params (TFirst First, TSecond Second)[] data)
    where TFirst : class where TSecond : class
    => modelBuilder.HasJoinData(data.AsEnumerable());

public static void HasJoinData<TFirst, TSecond>(
    this ModelBuilder modelBuilder,
    IEnumerable<(TFirst First, TSecond Second)> data)
    where TFirst : class where TSecond : class
{
    var firstEntityType = modelBuilder.Model.FindEntityType(typeof(TFirst));
    var secondEntityType = modelBuilder.Model.FindEntityType(typeof(TSecond));

    // The skip navigation from TFirst to TSecond, and its join entity type
    var firstToSecond = firstEntityType.GetSkipNavigations()
        .Single(n => n.TargetEntityType == secondEntityType);
    var joinEntityType = firstToSecond.JoinEntityType;

    // The join entity's FK properties pointing back to TFirst and TSecond
    var firstProperty = firstToSecond.ForeignKey.Properties.Single();
    var secondProperty = firstToSecond.Inverse.ForeignKey.Properties.Single();

    // Getters for the principal key values on TFirst and TSecond
    var firstValueGetter = firstToSecond.ForeignKey.PrincipalKey.Properties.Single().GetGetter();
    var secondValueGetter = firstToSecond.Inverse.ForeignKey.PrincipalKey.Properties.Single().GetGetter();

    var seedData = data.Select(e => (object)new Dictionary<string, object>
    {
        [firstProperty.Name] = firstValueGetter.GetClrValue(e.First),
        [secondProperty.Name] = secondValueGetter.GetClrValue(e.Second),
    });

    modelBuilder.Entity(joinEntityType.Name).HasData(seedData);
}
Also, here you don't need to know which type is "left" and which is "right", nor does it require a special base class or interface. Just pass a sequence of entity pairs and it will properly seed the conventional join entity; e.g., with the OP's example, both
modelBuilder.HasJoinData((book, category));
and
modelBuilder.HasJoinData((category, book));
would do.
Update (EF Core 5.0.2)
It works well using the name of the associative table:
builder.Entity("ContractDeclarationType").HasData(
    new { ContractsId = 1L, DeclarationTypesId = 1L },
    new { ContractsId = 1L, DeclarationTypesId = 2L },
    new { ContractsId = 1L, DeclarationTypesId = 3L });
I ended up whipping up a generic solution to this problem based upon the answer from Ivan (thanks!). I'm now able to seed all my M2M tables with this syntax:
// Add book1 and book2 to category1:
modelBuilder.HasM2MData(new [] { book1, book2 }, new [] { category1 });
This may not be fully robust, but it should work with conventional M2M mappings.
It makes some assumptions:
T1 & T2 Inherit from some ModelBase that provides an Id property (a minimal sketch of such a base class follows this list).
T1 & T2 Have exactly one ICollection<OtherType> property.
You know the correct order (which model is T1 and which is T2) — this can be discovered by running the migration for the tables first and inspecting the migration.
You're running EFCore5 RC2 or later (see this issue).
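For reference, a hypothetical ModelBase matching that first assumption could be as minimal as this (the key type in your own models may differ):

// Hypothetical base class assumed by HasM2MData below: it just exposes the primary key.
public abstract class ModelBase
{
    public int Id { get; set; }
}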
public static void HasM2MData<T1, T2>(this ModelBuilder mb, T1[] t1s, T2[] t2s)
    where T1 : ModelBase where T2 : ModelBase
{
    // Conventional join entity name: {T1}{T2}, e.g. "BookCategory"
    string table = $"{typeof(T1).Name}{typeof(T2).Name}";

    PropertyInfo t1Prop = GetM2MProperty<T1, T2>();
    PropertyInfo t2Prop = GetM2MProperty<T2, T1>();

    // Shadow FK property names, assuming the key property is named "Id":
    // t1Key (e.g. "CategoriesId") holds T2's key, t2Key (e.g. "BooksId") holds T1's key.
    string t1Key = $"{t1Prop.Name}Id";
    string t2Key = $"{t2Prop.Name}Id";

    foreach (T1 t1 in t1s) {
        foreach (T2 t2 in t2s) {
            mb.Entity(table).HasData(new Dictionary<string, object>() { [t2Key] = t1.Id, [t1Key] = t2.Id });
        }
    }
}

// Get a property on T1 which is assignable to type ICollection<T2>, representing the m2m relationship
private static PropertyInfo GetM2MProperty<T1, T2>() {
    Type assignableType = typeof(ICollection<T2>);
    List<PropertyInfo> props = typeof(T1).GetProperties()
        .Where(pi => pi.PropertyType.IsAssignableTo(assignableType))
        .ToList();
    if (props.Count() != 1) {
        throw new SystemException(
            $"Expected {typeof(T1)} to have exactly one column of type {assignableType}; got: {props.Count()}");
    }
    return props.First();
}
In the migration, we see something like:
migrationBuilder.InsertData(
    table: "BookCategory",
    columns: new[] { "BooksId", "CategoriesId" },
    values: new object[,]
    {
        { "book1", "category1" },
        { "book2", "category1" }
    });

Get random entries in firebase real-time database

This is my code to get 5 items from realtime database:
val database = FirebaseDatabase.getInstance()
val brandReference = database.getReference("brandGame").limitToFirst(5)
brandReference.addValueEventListener(object : ValueEventListener {
    override fun onDataChange(dataSnapshot: DataSnapshot) {
        dataSnapshot.children.forEach {
            ...
        }
    }

    override fun onCancelled(databaseError: DatabaseError) {
        // handle the error
    }
})
And this is what my Realtime Database looks like:
What's the best way to get 5 items randomly? I know there isn't a random function in the Realtime Database yet.
If you know the number of elements in the brandGame/-reference, you could pick 5 random numbers between 1 and numberOfElements and retrieve those. This would result in multiple calls to the database.
Alternatively, you could download everything from the brandGame/-reference and just pick 5 random elements using pure Kotlin. But then you must download everything in the reference, which could be a lot.
The best option is to set up a Cloud Function that does the "pick 5 random options" logic server-side (https://firebase.google.com/docs/functions/), but this requires that you write some JS :)
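A minimal sketch of the second option (download the whole brandGame node in a single read and pick 5 locally); only the node name comes from your code, the rest is an assumption:

val brandGameRef = FirebaseDatabase.getInstance().getReference("brandGame")
brandGameRef.addListenerForSingleValueEvent(object : ValueEventListener {
    override fun onDataChange(dataSnapshot: DataSnapshot) {
        // Shuffle the children client-side and keep 5 of them.
        val fiveRandom = dataSnapshot.children.shuffled().take(5)
        fiveRandom.forEach { snapshot ->
            // e.g. snapshot.child("brand").getValue(String::class.java)
        }
    }

    override fun onCancelled(databaseError: DatabaseError) {
        // handle the error
    }
})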
As you say, there is no built-in way to get random elements from a reference.
To get a random brand, you can use the following code client-side:
val rootRef = FirebaseDatabase.getInstance().reference
val brandGameRef = rootRef.child("brandGame")
val valueEventListener = object : ValueEventListener {
    override fun onDataChange(dataSnapshot: DataSnapshot) {
        val brandCountList = ArrayList<String>()
        for (ds in dataSnapshot.children) {
            val brand = ds.child("brand").getValue(String::class.java)
            brandCountList.add(brand!!)
        }
        val brandCount = brandCountList.size
        val randomNumber = Random().nextInt(brandCount)
        val randomBrand = ArrayList<String>()
        randomBrand.add(brandCountList.get(randomNumber)) // Add the brand product to the list
        val arrayAdapter = ArrayAdapter(applicationContext, android.R.layout.simple_list_item_1, randomBrand)
        list_view.adapter = arrayAdapter
    }

    override fun onCancelled(databaseError: DatabaseError) {
        // Handle exceptions
    }
}
brandGameRef.addListenerForSingleValueEvent(valueEventListener)

Tornadofx Javafx - How to reload a view / component

So it's a basic question.
What I am trying to achieve is refreshing views from other views.
Let's say I have a view EmployeeTableView which shows a tabular representation of employees by doing a REST API call.
In another view, I have the filter EmployeeFilterView wherein I have gender, salary range, employee type, etc.
I also have a userContext object in which I store the user preferences. So by default, let's say I have stored the value of the gender filter to be Male, the salary range to be ALL, etc. This object is sent as a parameter to the EmployeeTableView.
When the EmployeeTableView is loaded, I do a REST API call with the userContext values to get the employee details. That works fine. Now I change the gender filter to Female and assign this value in my userContext.
Now, if I could just reload the EmployeeTableView with the userContext object, the REST API call would get the updated values.
But how can I do that?
Also, suggest a better approach if you have one.
The EventBus is one valid solution to this. Another would be to use a ViewModel or Controller as the UserContext object and let that include the actual observable list of employees and then bind that list to the TableView in EmployeeTableView. Whenever the list in the context is updated, the TableView will update as well.
The filter view would call a function in the UserContext to perform the actual REST call and update the list of employees based on that.
You could create a separate EmployeeQuery object that can be injected into both the EmployeeFilterView and the UserContext so it can extract the selected filter values to perform the query. This query object contains a list of all the search parameters you want to pass to the server.
You could also consider creating a separate scope to keep these components separated if that makes sense to your architecture.
Exactly how you define these components is mostly a matter of taste; here is one suggestion. I used the RangeSlider from ControlsFX for the mock search UI.
To make it easier to imagine how this ties together, here is a screenshot:
(All names and salaries are fictional :)
/**
 * The employee domain model, implementing JsonModel so it can be fetched
 * via the REST API
 */
class Employee : JsonModel {
    val nameProperty = SimpleStringProperty()
    var name by nameProperty

    val salaryProperty = SimpleIntegerProperty()
    var salary by salaryProperty

    val genderProperty = SimpleObjectProperty<Gender>()
    var gender by genderProperty

    override fun updateModel(json: JsonObject) {
        with (json) {
            name = getString("name")
            salary = getInt("salary")
            gender = Gender.valueOf(getString("gender"))
        }
    }
}

enum class Gender { Male, Female }

/**
 * Container for the list of employees as well as a search function called by the filter
 * view whenever it should update the employee list.
 */
class EmployeeContext : Controller() {
    val api: Rest by inject()
    val query: EmployeeQuery by inject()
    val employees = SimpleListProperty<Employee>()

    fun search() {
        runAsync {
            FXCollections.observableArrayList(Employee().apply {
                name = "Edvin Syse"
                gender = Gender.Male
                salary = 200_000
            })
            //api.post("employees/query", query).list().toModel<Employee>()
        } ui {
            employees.value = it
        }
    }
}

/**
 * Query object used to define the query sent to the server
 */
class EmployeeQuery : ViewModel(), JsonModel {
    val genderProperty = SimpleObjectProperty<Gender>(Gender.Female)
    var gender by genderProperty

    val salaryMinProperty = SimpleIntegerProperty(50_000)
    var salaryMin by salaryMinProperty

    val salaryMaxProperty = SimpleIntegerProperty(250_000)
    var salaryMax by salaryMaxProperty

    val salaryDescription = stringBinding(salaryMinProperty, salaryMaxProperty) {
        "$$salaryMin - $$salaryMax"
    }

    override fun toJSON(json: JsonBuilder) {
        with(json) {
            add("gender", gender.toString())
            add("salaryMin", salaryMin)
            add("salaryMax", salaryMax)
        }
    }
}

/**
 * The search/filter UI
 */
class EmployeeFilterView : View() {
    val query: EmployeeQuery by inject()
    val context: EmployeeContext by inject()

    override val root = form {
        fieldset("Employee Filter") {
            field("Gender") {
                combobox(query.genderProperty, Gender.values().toList())
            }
            field("Salary Range") {
                vbox {
                    alignment = Pos.CENTER
                    add(RangeSlider().apply {
                        max = 500_000.0
                        lowValueProperty().bindBidirectional(query.salaryMinProperty)
                        highValueProperty().bindBidirectional(query.salaryMaxProperty)
                    })
                    label(query.salaryDescription)
                }
            }
            button("Search").action {
                context.search()
            }
        }
    }
}

/**
 * The UI that shows the search results
 */
class EmployeeTableView : View() {
    val context: EmployeeContext by inject()

    override val root = borderpane {
        center {
            tableview(context.employees) {
                column("Name", Employee::nameProperty)
                column("Gender", Employee::genderProperty)
                column("Salary", Employee::salaryProperty)
            }
        }
    }
}

/**
 * A sample view that ties the filter UI and result UI together
 */
class MainView : View("Employee App") {
    override val root = hbox {
        add(EmployeeFilterView::class)
        add(EmployeeTableView::class)
    }
}
I ended up using the TornadoFX EventBus.
Basically, when I change any of the filters, I fire an event which rebuilds the node with the updated values.
Not sure whether the approach is right; that's why I'm still keeping this open for discussion.
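For reference, a minimal sketch of that EventBus approach; the event class name is made up, while EmployeeQuery/EmployeeContext come from the answer above:

// Hypothetical event carrying the updated filter values.
class FilterChangedEvent(val query: EmployeeQuery) : FXEvent()

// In EmployeeFilterView: fire the event when the user applies a filter.
button("Search").action {
    fire(FilterChangedEvent(query))
}

// In EmployeeTableView (a Component), subscribe and reload the data.
init {
    subscribe<FilterChangedEvent> { event ->
        context.search() // re-run the REST call with the updated query/context
    }
}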

AutoMapper: Can't access original object instances passed to Map() from within AfterMap()

I have code like this:
// Fields
Product _prod, _existingProd;

void Test()
{
    _prod = MakeAndPopulateSomeRandomProduct();
    _existingProd = GetProdFromDb(1);

    Mapper.CreateMap<Product, Product>()
        .AfterMap((s, d) =>
        {
            Console.WriteLine(d == _existingProd); // Why does this print false?
            // Customize other properties on destination object
        });

    Mapper.Map(_prod, _existingProd);
}
When I call Test(), false is printed but I expected true. In my scenario, it is important to be able to access the original destination object via the AfterMap argument. I only included the fields to demonstrate the problem but in my real code, I don't have direct access to them. How can I access the object instances passed in to Map() when customizing the mapping?
The following example works. Probably you are using some type converter which creates a new instance... Also, please provide all mapping configurations to better understand the problem.
[TestFixture]
public class AfterMap_Test
{
    // Fields
    private Product _prod, _existingProd;

    [Test]
    public void Test()
    {
        Mapper.CreateMap<Product, Product>()
            .AfterMap((s, d) =>
            {
                Trace.WriteLine(d == _existingProd); // Why does this print false?
                // Customize other properties on destination object
            });

        _existingProd = new Product { P1 = "Destination" };
        _prod = new Product { P1 = "Source" };

        Mapper.Map(_prod, _existingProd);
    }
}

internal class Product
{
    public string P1 { get; set; }
}
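For comparison, a hedged sketch of the two Map overloads with the same legacy static API; the single-argument form always creates a new destination instance, so code comparing it against a pre-existing object will print false:

// Maps onto the existing instance: AfterMap receives _existingProd itself.
Mapper.Map(_prod, _existingProd);

// Creates and returns a brand-new Product: AfterMap receives that new instance,
// so comparing it to _existingProd yields false.
Product mapped = Mapper.Map<Product>(_prod);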

Can a JPA Query return results as a Java Map?

We are currently building a Map manually based on the two fields that are returned by a named JPA query because JPA 2.1 only provides a getResultList() method:
@NamedQuery(name="myQuery", query="select c.name, c.number from Client c")
HashMap<Long, String> myMap = new HashMap<Long, String>();
for (Client c : em.createNamedQuery("myQuery").getResultList()) {
    myMap.put(c.getNumber(), c.getName());
}
But, I feel like a custom mapper or similar would be more performant since this list could easily be 30,000+ results.
Any ideas on how to build a Map without iterating manually?
(I am using OpenJPA, not hibernate)
Returning a Map result using JPA Query getResultStream
Since the JPA 2.2 version, you can use the getResultStream Query method to transform the List<Tuple> result into a Map<Integer, Integer>:
Map<Integer, Integer> postCountByYearMap = entityManager.createQuery("""
    select
       YEAR(p.createdOn) as year,
       count(p) as postCount
    from
       Post p
    group by
       YEAR(p.createdOn)
    """, Tuple.class)
.getResultStream()
.collect(
    Collectors.toMap(
        tuple -> ((Number) tuple.get("year")).intValue(),
        tuple -> ((Number) tuple.get("postCount")).intValue()
    )
);
Returning a Map result using JPA Query getResultList and Java stream
If you're using JPA 2.1 or older versions but your application is running on Java 8 or a newer version, then you can use getResultList and transform the List<Tuple> to a Java 8 stream:
Map<Integer, Integer> postCountByYearMap = entityManager.createQuery("""
    select
       YEAR(p.createdOn) as year,
       count(p) as postCount
    from
       Post p
    group by
       YEAR(p.createdOn)
    """, Tuple.class)
.getResultList()
.stream()
.collect(
    Collectors.toMap(
        tuple -> ((Number) tuple.get("year")).intValue(),
        tuple -> ((Number) tuple.get("postCount")).intValue()
    )
);
Returning a Map result using a Hibernate-specific ResultTransformer
Another option is to use the MapResultTransformer class provided by the Hibernate Types open-source project:
Map<Number, Number> postCountByYearMap = (Map<Number, Number>) entityManager.createQuery("""
    select
       YEAR(p.createdOn) as year,
       count(p) as postCount
    from
       Post p
    group by
       YEAR(p.createdOn)
    """)
.unwrap(org.hibernate.query.Query.class)
.setResultTransformer(
    new MapResultTransformer<Number, Number>()
)
.getSingleResult();
The MapResultTransformer is suitable for projects still running on Java 6 or using older Hibernate versions.
Avoid returning large result sets
The OP said:
But, I feel like a custom mapper or similar would be more performant
since this list could easily be 30,000+ results.
This is a terrible idea. You never need to select 30k records. How would that fit in the UI? Or, why would you operate on such a large batch of records?
You should use query pagination as this will help you reduce the transaction response time and provide better concurrency.
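For instance, a minimal pagination sketch with plain JPA (the page size here is arbitrary):

List<Tuple> page = entityManager.createQuery(
        "select c.number as number, c.name as name from Client c order by c.number",
        Tuple.class)
    .setFirstResult(0)   // offset of the first row
    .setMaxResults(100)  // page size
    .getResultList();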
There is no standard way to get JPA to return a map.
see related question: JPA 2.0 native query results as map
Iterating manually should be fine. The time to iterate a list/map in memory is going to be small relative to the time to execute/return the query results. I wouldn't try to futz with the JPA internals or customization unless there was conclusive evidence that manual iteration was not workable.
Also, if you have other places where you turn query result Lists into Maps, you probably want to refactor that into a utility method with a parameter to indicate the map key property.
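A hedged sketch of such a utility, using key/value extractor functions rather than a property-name parameter (all names here are made up):

// Generic helper: turn a query result list into a map using the given extractors.
// Requires java.util.LinkedHashMap/List/Map and java.util.function.Function.
public static <T, K, V> Map<K, V> toMap(List<T> rows, Function<T, K> keyFn, Function<T, V> valueFn) {
    Map<K, V> map = new LinkedHashMap<>();
    for (T row : rows) {
        map.put(keyFn.apply(row), valueFn.apply(row));
    }
    return map;
}

// Usage with the OP's projection query, where each row is an Object[] of (name, number):
// Map<Long, String> myMap = toMap(rows, r -> (Long) r[1], r -> (String) r[0]);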
You can retrieve a list of java.util.Map.Entry instead.
Therefore the collection in your entity should be modeled as a Map:
@OneToMany
@MapKeyEnumerated(EnumType.STRING)
public Map<PhoneType, PhoneNumber> phones;
In the example PhoneType is a simple enum, PhoneNumber is an entity. In your query use the ENTRY keyword that was introduced in JPA 2.0 for map operations:
public List<Entry> getPersonPhones() {
    return em.createQuery("SELECT ENTRY(pn) FROM Person p JOIN p.phones pn", java.util.Map.Entry.class).getResultList();
}
You are now ready to retrieve the entries and start working with them:
List<java.util.Map.Entry> phoneEntries = personDao.getPersonPhoneNumbers();
for (java.util.Map.Entry<PhoneType, PhoneNumber> entry : phoneEntries) {
    // entry.getKey(), entry.getValue()
}
If you still need the entries in a map but don't want to iterate through your list of entries manually, have a look on this post Convert Set<Map.Entry<K, V>> to HashMap<K, V> which works with Java 8.
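For completeness, with Java 8 that collection step boils down to something like this (assuming the list is typed as List<Map.Entry<PhoneType, PhoneNumber>>):

Map<PhoneType, PhoneNumber> phones = phoneEntries.stream()
    .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));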
This works fine.
Repository code:
@Repository
public interface BookRepository extends CrudRepository<Book, Id> {
    @Query("SELECT b.name, b.author from Book b")
    List<Object[]> findBooks();
}
service.java
List<Object[]> list = bookRepository.findBooks();
for (Object[] ob : list) {
    String key = (String) ob[0];
    String value = (String) ob[1];
}
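And if you want the Map directly instead of looping, the same rows can be collected in one go (column order follows the @Query above):

Map<String, String> authorByName = bookRepository.findBooks().stream()
    .collect(Collectors.toMap(
        row -> (String) row[0],   // b.name
        row -> (String) row[1])); // b.author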
See https://codereview.stackexchange.com/questions/1409/jpa-query-to-return-a-map
Map<String, Object> map = null;
try {
    EntityManager entityManager = getEntityManager();
    Query query = entityManager.createNativeQuery(sql);
    // EclipseLink-specific hint (org.eclipse.persistence.config.QueryHints / ResultType)
    query.setHint(QueryHints.RESULT_TYPE, ResultType.Map);
    map = (Map<String, Object>) query.getSingleResult();
} catch (Exception e) {
    // handle the exception; this is just a sample
}

List<Map<String, Object>> list = null;
try {
    EntityManager entityManager = getEntityManager();
    Query query = entityManager.createNativeQuery(sql);
    query.setHint(QueryHints.RESULT_TYPE, ResultType.Map);
    list = query.getResultList();
} catch (Exception e) {
    // handle the exception; this is just a sample
}
JPA v2.2
Though I am late here, if someone reaches this looking for a solution, here is my custom working solution for multiple selected columns with multiple rows:
Query query = this.entityManager.createNativeQuery("SELECT abc, xyz, pqr,...FROM...", Tuple.class);
.
.
.
List<Tuple> lst = query.getResultList();
List<Map<String, Object>> result = convertTuplesToMap(lst);
Implementation of convertTuplesToMap():
public static List<Map<String, Object>> convertTuplesToMap(List<Tuple> tuples) {
    List<Map<String, Object>> result = new ArrayList<>();
    for (Tuple single : tuples) {
        Map<String, Object> tempMap = new HashMap<>();
        for (TupleElement<?> key : single.getElements()) {
            tempMap.put(key.getAlias(), single.get(key));
        }
        result.add(tempMap);
    }
    return result;
}
In case of Java 8, you can return the entries through a custom entry class ("CustomEntryClass" below).
Since the return type is a Stream, the calling method (repository layer) must have a @Transactional(readOnly = true|false) annotation, otherwise an exception will be thrown.
Make sure you use the fully qualified name of the CustomEntryClass class:
@Query("select new CustomEntryClass(config.propertyName, config.propertyValue) " +
       "from ClientConfigBO config where config.clientCode = :clientCode")
Stream<CustomEntryClass<String, String>> getByClientCodeMap(@Param("clientCode") String clientCode);
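CustomEntryClass itself is not shown in that answer; a hypothetical version compatible with the constructor expression above could look like:

// Hypothetical key/value holder targeted by the JPQL constructor expression.
public class CustomEntryClass<K, V> {
    private final K key;
    private final V value;

    public CustomEntryClass(K key, V value) {
        this.key = key;
        this.value = value;
    }

    public K getKey() { return key; }
    public V getValue() { return value; }
}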
With custom result class and a bit of Guava, this is my approach which works quite well:
public static class SlugPair {
    String canonicalSlug;
    String slug;

    public SlugPair(String canonicalSlug, String slug) {
        super();
        this.canonicalSlug = canonicalSlug;
        this.slug = slug;
    }
}
...
final TypedQuery<SlugPair> query = em.createQuery(
    "SELECT NEW com.quikdo.core.impl.JpaPlaceRepository$SlugPair(e.canonicalSlug, e.slug) FROM "
        + entityClass.getName() + " e WHERE e.canonicalSlug IN :canonicalSlugs",
    SlugPair.class);
query.setParameter("canonicalSlugs", canonicalSlugs);

final Map<String, SlugPair> existingSlugs =
    FluentIterable.from(query.getResultList()).uniqueIndex(
        new Function<SlugPair, String>() {
            @Override @Nullable
            public String apply(@Nullable SlugPair input) {
                return input.canonicalSlug;
            }
        });
Using Java 8+, you can get the results as a list of Object arrays (each column of the select keeps the same index in the result array) via the Hibernate entity manager, stream the result list, map each row into an entry (key, value), and collect the entries into a map of the desired type.
final String sql = "SELECT ID, MODE FROM MODES";
List<Object[]> result = entityManager.createNativeQuery(sql).getResultList();
return result.stream()
    .map(o -> new AbstractMap.SimpleEntry<>(Long.valueOf(o[0].toString()), String.valueOf(o[1])))
    .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
The easiest and simplest way that worked for me is:
String[] columns = {"id", "name", "salary", "phone", "address", "dob"};
String query = "SELECT id,name,salary,phone,address,dob from users ";
List<Object[]> queryResp = em.createNativeQuery(query).getResultList();
List<Map<String, String>> dataList = new ArrayList<>();
for (Object[] obj : queryResp) {
    Map<String, String> row = new HashMap<>(columns.length);
    for (int i = 0; i < columns.length; i++) {
        if (obj[i] != null)
            row.put(columns[i], obj[i].toString());
        else
            row.put(columns[i], "");
    }
    dataList.add(row);
}
Please refer to JPA 2.0 native query results as map.
In your case, with Postgres, it would be something like:
List<String> list = em.createNativeQuery("select cast(json_object_agg(c.number, c.name) as text) from schema.client c")
.getResultList();
//handle exception here, this is just sample
Map map = new ObjectMapper().readValue(list.get(0), Map.class);
Kindly note, I am just sharing my workaround with Postgres.
How about this?
@NamedNativeQueries({
    @NamedNativeQuery(
        name = "myQuery",
        query = "select c.name, c.number from Client c",
        resultClass = RegularClient.class
    )
})
and
public static List<RegularClient> runMyQuery() {
    return entityManager().createNamedQuery("myQuery").getResultList();
}
