Dynamic consumer groups in Kafka listeners combined with RecordFilterStrategy - spring-kafka

I would like to create different consumer groups dynamically for my Kafka listener, so I have given up the standard way with @KafkaListener and went through the manual
way with ContainerProperties, which works fine, as in the example below.
However, I can't use setRecordFilterStrategy because it is only available on
ConcurrentKafkaListenerContainerFactory, the factory behind the @KafkaListener annotation.
Do you know a way to do it, or another way to have a dynamic consumer group and the possibility to use a record filter strategy at the same time?
Regards,
Luc
import static org.apache.kafka.clients.consumer.ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG;
import static org.apache.kafka.clients.consumer.ConsumerConfig.GROUP_ID_CONFIG;

import java.util.Map;

import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;
import org.springframework.kafka.listener.ContainerProperties;
import org.springframework.kafka.listener.MessageListener;

import com.google.common.collect.ImmutableMap;

Map<String, Object> consumerConfig = ImmutableMap.of(
        BOOTSTRAP_SERVERS_CONFIG, "brokerAddress",
        GROUP_ID_CONFIG, "groupId");

DefaultKafkaConsumerFactory<String, String> kafkaConsumerFactory =
        new DefaultKafkaConsumerFactory<>(
                consumerConfig,
                new StringDeserializer(),
                new StringDeserializer());

ContainerProperties containerProperties = new ContainerProperties("topicName");
containerProperties.setMessageListener((MessageListener<String, String>) record -> {
    // do something with the received record
});

ConcurrentMessageListenerContainer<String, String> container =
        new ConcurrentMessageListenerContainer<>(
                kafkaConsumerFactory,
                containerProperties);
container.start();

See FilteringMessageListenerAdapter: you wrap your MessageListener together with a RecordFilterStrategy and set the adapter as the container's message listener.
More in the docs: https://docs.spring.io/spring-kafka/docs/current/reference/html/#filtering-messages
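A minimal sketch of how that plugs into the manual container above (the filter predicate here is a placeholder; filter() returning true means the record is discarded):

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.listener.MessageListener;
import org.springframework.kafka.listener.adapter.FilteringMessageListenerAdapter;
import org.springframework.kafka.listener.adapter.RecordFilterStrategy;

MessageListener<String, String> delegate = record -> {
    // handle records that survive the filter
};

// returning true discards the record before it reaches the delegate
RecordFilterStrategy<String, String> filterStrategy =
        (ConsumerRecord<String, String> rec) -> rec.value() == null;

containerProperties.setMessageListener(
        new FilteringMessageListenerAdapter<>(delegate, filterStrategy));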

Related

How to add a Group inside another group in AEM?

I am trying to add a group as a member of another group in AEM using a workflow, but the member is not added and, moreover, no error is thrown.
Map<String, Object> subServiceParameters = new HashMap<>();
subServiceParameters.put(ResourceResolverFactory.SUBSERVICE, GlobalConstants.SERVICE_USER_MAPPER_NAME);
ResourceResolver resourceResolver = resourceResolverFactory.getServiceResourceResolver(subServiceParameters);
UserManager userManager = workflowSession.adaptTo(ResourceResolver.class).adaptTo(UserManager.class);
Group group = (Group) userManager.getAuthorizable("COMPANY_ADMINISTRATORS");
group.addMember(userManager.getAuthorizable("COMPANY_TAG_ADMINISTRATORS"));
Session session = resourceResolver.adaptTo(Session.class);
session.save();
I was using the workflowSession to create the UserManager object while saving the JCR session obtained from the resourceResolver; because those are two different sessions, my change was never persisted. Use the line below to obtain the UserManager instead; the rest works fine.
UserManager userManager = resourceResolver.adaptTo(UserManager.class);
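Putting it together, a sketch of the corrected flow, which keeps the UserManager and the saved session on the same resolver (group names and the sub-service parameters are from the question):

ResourceResolver resourceResolver =
        resourceResolverFactory.getServiceResourceResolver(subServiceParameters);
// Obtain the UserManager from the same resolver whose session we save
UserManager userManager = resourceResolver.adaptTo(UserManager.class);
Group group = (Group) userManager.getAuthorizable("COMPANY_ADMINISTRATORS");
group.addMember(userManager.getAuthorizable("COMPANY_TAG_ADMINISTRATORS"));
// Persist via the JCR session backing that same resolver
resourceResolver.adaptTo(Session.class).save();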

Workfront: Set predecessor when adding a task

I'm using Java and the Workfront API to create a new task, and would like to specify a predecessor while creating the task. If I add the task first and then update it, I am able to add the predecessor, but I'd prefer to specify the predecessor when adding the task if possible.
Here's what I've tried, but with no luck; I just get a 500 error (internal server error).
...
Map<String, Object> map = new HashMap<String, Object>();
map.put( "name", "Test Task" );
map.put( "projectID", projectId );
JSONArray array = new JSONArray();
JSONObject jsonObj = new JSONObject();
jsonObj.put( "predecessorID", predecessorId );
jsonObj.put( "predecessorType", "fs" );
array.put( jsonObj );
map.put( "predecessors", array );
client.post( "task", map );
Has anyone been able to do this? Am I just missing something?
I'm almost positive that you can't set dependencies in the same operation as object creation. You'll have to PUT an update on the object after its creation. I'm not sure about the syntax for your Java implementation, but this is what the raw HTTP call would look like:
PUT https://<url>.my.workfront.com/attask/api/v9.0/task/<uuid>?updates={predecessors:[{predecessorID:"<ID of dep>",predecessorType:"<ss/sf/fs/ff>"}]}&apiKey=<key>
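In terms of the Java client from the question, a sketch of that two-step sequence (assuming the client's post returns the created object as JSON and that it exposes a put(objCode, objId, fields) method analogous to its post; everything else mirrors the snippet above):

// 1) Create the task without predecessors
Map<String, Object> fields = new HashMap<String, Object>();
fields.put("name", "Test Task");
fields.put("projectID", projectId);
JSONObject created = client.post("task", fields);
String taskId = created.getString("ID");

// 2) Update the new task with its predecessor
JSONObject predecessor = new JSONObject();
predecessor.put("predecessorID", predecessorId);
predecessor.put("predecessorType", "fs");
JSONArray predecessors = new JSONArray();
predecessors.put(predecessor);

Map<String, Object> updates = new HashMap<String, Object>();
updates.put("predecessors", predecessors);
client.put("task", taskId, updates); // assumed update method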

Update operation in DynamoDB

Here is a question that came up during our data structure design. Currently we have data shaped like the following:
class {
    private String id;
    private Map<String, Object> map;
    ...
}
We will frequently add new entries to the map. I am wondering whether the data needs to be fetched from DynamoDB first, the new entries added to the map, and the modified data written back with a DB update, or whether the map can be updated in place (that is how a Mongo update works). If it can't, the data structure design isn't good.
Yes, you need to fetch the data from DynamoDB first, add the new entry to the map, and save the object back.
If the attribute is of type List, you can append values to the existing list in a single update:
import com.amazonaws.services.dynamodbv2.document.UpdateItemOutcome;
import com.amazonaws.services.dynamodbv2.document.spec.UpdateItemSpec;
import com.amazonaws.services.dynamodbv2.document.utils.NameMap;
import com.amazonaws.services.dynamodbv2.document.utils.ValueMap;
import com.amazonaws.services.dynamodbv2.model.ReturnValue;

List<String> nameList = new ArrayList<>();
nameList.add("1");

UpdateItemSpec updateItemSpec = new UpdateItemSpec()
        .withPrimaryKey("Id", testId)
        .withReturnValues(ReturnValue.ALL_NEW)
        // list_append concatenates the existing list with :val1
        .withUpdateExpression("set #Names = list_append(#Names, :val1)")
        .withNameMap(new NameMap().with("#Names", "Names"))
        .withValueMap(new ValueMap().withList(":val1", nameList));

UpdateItemOutcome outcome = table.updateItem(updateItemSpec);
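For the map attribute described in the question, a sketch of that read-modify-write flow with the same Document API (the "attributes" name and key values are placeholders):

import com.amazonaws.services.dynamodbv2.document.Item;

// Fetch the current item and its map attribute
Item item = table.getItem("Id", testId);
Map<String, Object> map = item.getMap("attributes");

// Modify the map locally, then write the whole attribute back
map.put("newKey", "newValue");
table.updateItem(new UpdateItemSpec()
        .withPrimaryKey("Id", testId)
        .withUpdateExpression("set #attrs = :m")
        .withNameMap(new NameMap().with("#attrs", "attributes"))
        .withValueMap(new ValueMap().withMap(":m", map)));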

WCF Transaction with multiple inserts

When creating a user, entries are required in multiple tables. I am trying to create a transaction that creates a new entry in one table and then passes the new entity id into the parent table, and so on. The error I am getting is:
The transaction manager has disabled its support for remote/network
transactions. (Exception from HRESULT: 0x8004D024)
I believe this is caused by creating multiple connections within a single TransactionScope, but I am unsure what the best/most efficient way of doing this is.
[OperationBehavior(TransactionScopeRequired = true)]
public int CreateUser(CreateUserData createData)
{
    // Create a new family group and get the ID
    var familyGroupId = createData.FamilyGroupId ?? CreateFamilyGroup();
    // Create the APUser and get the Id
    var apUserId = CreateAPUser(createData.UserId, familyGroupId);
    // Create the institution user and get the Id
    var institutionUserId = CreateInsUser(apUserId, createData.AlternateId, createData.InstitutionId);
    // Create the investigator group user and return the Id
    return AddUserToGroup(createData.InvestigatorGroupId, institutionUserId);
}
This is an example of one of the function calls; all the others follow the same format:
public int CreateFamilyGroup()
{
    var familyRepo = _FamilyRepo ?? new FamilyGroupRepository();
    var familyGroup = new FamilyGroup() { CreationDate = DateTime.Now };
    return familyRepo.AddFamilyGroup(familyGroup);
}
And the repository call for this is as follows
public int AddFamilyGroup(FamilyGroup familyGroup)
{
    using (var context = new GameDbContext())
    {
        var newGroup = context.FamilyGroups.Add(familyGroup);
        context.SaveChanges();
        return newGroup.FamilyGroupId;
    }
}
I believe this is caused by creating multiple connections within a single TransactionScope
Yes, that is the problem. It does not really matter how you avoid it, as long as you avoid it. A common approach is to have one connection and one EF context per WCF request. You need to find a way to pass that EF context along.
The method AddFamilyGroup illustrates a common anti-pattern with EF: you are using EF as a CRUD facility. It's supposed to be more like a live object graph connected to the database. The entire WCF request should share the same EF context; if you move in that direction, the problem goes away.

Alfresco error: Model '{custom.model}custommodel' does not exist

Using Alfresco 5.0 community edition.
I am trying to deploy the custom model provided in the answer to another question,
but using the dynamic deployment approach specified at http://docs.alfresco.com/5.0/tasks/deploy-dynamic.html.
Although the GUI says the model is "activated", I get the following WARN in alfresco.log:
21:24:30,587 WARN [org.alfresco.repo.dictionary.DictionaryDAO] [ajp-apr-8009-exec-4]
org.alfresco.service.cmr.dictionary.DictionaryException:
00140008 Model '{custom.model}custommodel' does not exist
When I try to use it with CMIS 1.1, I'm getting an error back from the web service:
Type 'P:cmod:customDoc' is unknown!
Here is the relevant bit of the code, which uses the OpenCMIS Java API:
Map<String, Object> props = new HashMap<String, Object>();
props.put("cmis:objectTypeId", "cmis:document");
props.put("cmis:secondaryObjectTypeIds", Arrays.asList(new String[] {"P:cm:titled", "P:cmod:customDoc"}));
props.put("cmis:name", "myName");
props.put("cmod:property1", "value1");
ObjectId id = folder.createDocument(props, contentStream, VersioningState.MAJOR);
Am I specifying the namespace and aspect correctly (P:cmod:customDoc)? I've also tried cmod:aspectBase and other combinations, getting the same error.
My goal is to make a simple model where I can add a few extra fields to document objects (extending the default ContentModel).
It seems the warning is just that; it can be ignored.
But using CMIS 1.1, I have to do two steps to add in the extra properties from the custom aspect. (Trying to do it in one step gives the error "Type 'P:cmod:customDoc' is unknown!")
First createDocument() with the cmis:secondaryObjectTypeIds including the custom namespace, BUT don't add any custom properties.
Second, add the custom properties to the resulting document, and then updateProperties(). This will add in the custom property values to the document.
Map<String, Object> props = new HashMap<String, Object>();
props.put("cmis:objectTypeId", "cmis:document");
props.put("cmis:secondaryObjectTypeIds",
        Arrays.asList(new String[] {"P:cm:titled", "P:cmod:customDoc"}));
props.put("cmis:name", "myName");
Document document = folder.createDocument(props, contentStream, VersioningState.MAJOR);

props = new HashMap<String, Object>();
props.put("cmod:property1", "value1"); // here is where you add the custom properties
document = (Document) document.updateProperties(props);
(note: you need to reassign the document from the updateProperties result, otherwise it will be missing some information, such as parents)