Cometd with Spring-MVC for personalized chatting - spring-mvc

I am working on a Spring-MVC application and I would like to include personalized chat as a feature in it. After some research I found CometD to be a suitable option. After going through the documentation and the usual samples, I have a small amount of setup done. I need some help integrating a personalized chat service into the Spring-MVC app and enabling private chat when a user pushes the chat button.
So basically, I found out that "/service/chat" can be used for private chat, so I have a class for that, and to use private chat I must maintain a mapping of userId <-> sessionId, but I cannot find examples anywhere of how to do it. I am posting some of the code I have; kindly let me know what remains to be done and, if possible, point me to some resources or samples for it.
Controller code:
@Controller
@Singleton
public class MessageController {

    private MessageService messageService;

    @Autowired(required = true)
    @Qualifier(value = "messageService")
    public void setMessageService(MessageService messageService) {
        this.messageService = messageService;
    }

    @RequestMapping(value = "/startchatting", produces = "application/text")
    @ResponseBody
    public String startChattingService() {
        return "OK";
    }

    @RequestMapping(value = "/stopchatting", produces = "application/text")
    @ResponseBody
    public String stopChatting() {
        return "OK";
    }
}
Private Message Service :
@Service
public class PrivateMessageService {

    @Session
    private ServerSession session;

    @Listener("/service/private")
    public void handlePrivateMessage(ServerSession sender, ServerMessage message) {
        String userId = (String) message.get("targetUserId");
        // Mapping code necessary to map userIds to session ids.
        // ServerSession recipient = findServerSessionFromUserId(userId);
        // recipient.deliver(session, message.getChannel(), message.getData(), null);
    }
}
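The userId-to-session mapping referred to in the comments above is not shown in the post. As a rough sketch only (purely an assumption, using a hypothetical "/service/register" channel that the client would publish its userId to after handshake, and a CometD 3.x-style ServerSession.RemoveListener for cleanup), it could be kept in a separate CometD service:

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

import org.cometd.annotation.Listener;
import org.cometd.annotation.Service;
import org.cometd.bayeux.server.ServerMessage;
import org.cometd.bayeux.server.ServerSession;

// Hypothetical registry service; channel name and payload layout are assumptions.
@Service("userRegistry")
public class UserRegistryService {

    private final ConcurrentMap<String, ServerSession> usersById = new ConcurrentHashMap<>();

    @Listener("/service/register")
    public void register(ServerSession remote, ServerMessage message) {
        String userId = (String) message.getDataAsMap().get("userId");
        usersById.put(userId, remote);
        // Drop the mapping when the session goes away (disconnect or timeout).
        remote.addListener((ServerSession.RemoveListener) (session, timeout) ->
                usersById.remove(userId));
    }

    public ServerSession findByUserId(String userId) {
        return usersById.get(userId);
    }
}

PrivateMessageService could then look up the recipient via findByUserId(userId) and call deliver() on it, roughly as the commented-out lines suggest.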
CometConfigurer :
@Component
@Singleton
public class CometConfigurer {

    private BayeuxServer bayeuxServer;
    private ServerAnnotationProcessor processor;

    @Inject
    public void setBayeuxServer(BayeuxServer bayeuxServer) {
        this.bayeuxServer = bayeuxServer;
    }

    @PostConstruct
    public void init() {
        this.processor = new ServerAnnotationProcessor(bayeuxServer);
    }

    public Object postProcessBeforeInitialization(Object bean, String name) throws BeansException {
        System.out.println("Configuring service " + name);
        processor.processDependencies(bean);
        processor.processConfigurations(bean);
        processor.processCallbacks(bean);
        return bean;
    }

    public Object postProcessAfterInitialization(Object bean, String name) throws BeansException {
        return bean;
    }

    public void postProcessBeforeDestruction(Object bean, String name) throws BeansException {
        processor.deprocessCallbacks(bean);
    }

    @Bean(initMethod = "start", destroyMethod = "stop")
    public BayeuxServer bayeuxServer() {
        BayeuxServerImpl bean = new BayeuxServerImpl();
        // bean.setOption(BayeuxServerImpl.LOG_LEVEL, "3");
        return bean;
    }

    public void setServletContext(ServletContext servletContext) {
        servletContext.setAttribute(BayeuxServer.ATTRIBUTE, bayeuxServer);
    }
}
Cometd beans :
<beans:bean id="bayeuxServer" class="org.cometd.server.BayeuxServerImpl" init-method="start" destroy-method="stop"/>
I have directly included the JSP files that contain the CometD configuration and setup from https://github.com/fredang/cometd-spring-example and modified them to serve my needs. Kindly let me know what else remains; all suggestions are welcome. I am unable to find any detailed examples of the same task on the net that have more code than explanation. Thank you.

Using Spring 4.x's new WebSocket feature would definitely work; moreover, this new module ships with lots of very interesting features for your use case:
STOMP protocol support
messaging abstractions
session management
pub/sub mechanisms
etc
You can check this nice chat application that demonstrates all those features.
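As an illustration only (none of this comes from the linked chat application), a per-user STOMP setup along those lines might look roughly like the sketch below; the "/chat" endpoint name, the destination prefixes, and the ChatMessage payload class are all assumptions:

// WebSocketConfig.java - hypothetical Spring 4 STOMP configuration
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.simp.config.MessageBrokerRegistry;
import org.springframework.web.socket.config.annotation.AbstractWebSocketMessageBrokerConfigurer;
import org.springframework.web.socket.config.annotation.EnableWebSocketMessageBroker;
import org.springframework.web.socket.config.annotation.StompEndpointRegistry;

@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfig extends AbstractWebSocketMessageBrokerConfigurer {

    @Override
    public void configureMessageBroker(MessageBrokerRegistry registry) {
        registry.enableSimpleBroker("/queue", "/topic");    // in-memory broker
        registry.setApplicationDestinationPrefixes("/app"); // clients send to /app/**
    }

    @Override
    public void registerStompEndpoints(StompEndpointRegistry registry) {
        registry.addEndpoint("/chat").withSockJS();
    }
}

// PrivateChatController.java - routes a message to exactly one user
import java.security.Principal;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.messaging.handler.annotation.MessageMapping;
import org.springframework.messaging.simp.SimpMessagingTemplate;
import org.springframework.stereotype.Controller;

@Controller
public class PrivateChatController {

    @Autowired
    private SimpMessagingTemplate messagingTemplate;

    // Client sends to /app/private.chat; Spring delivers the payload only to
    // the target user's /user/queue/messages subscription.
    @MessageMapping("/private.chat")
    public void sendPrivately(ChatMessage message, Principal sender) {
        messagingTemplate.convertAndSendToUser(
                message.getTargetUserId(), "/queue/messages", message);
    }
}

// ChatMessage.java - hypothetical payload with the fields used above
public class ChatMessage {
    private String targetUserId;
    private String text;
    public String getTargetUserId() { return targetUserId; }
    public void setTargetUserId(String targetUserId) { this.targetUserId = targetUserId; }
    public String getText() { return text; }
    public void setText(String text) { this.text = text; }
}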

Related

Validate POJO using @Valid in Spring Cloud Stream

How can one enable validation using @Valid inside the following Kafka consumer code? I am using Spring Cloud Stream (Kafka Streams binder implementation), and my implementation uses the functional model, for example:
@Bean
public Consumer<KStream<String, @Valid Pojo>> process() {
    return messages -> messages.foreach((k, v) -> process(v));
}
I tried the following but it didn't work....
@Bean
public DefaultMessageHandlerMethodFactory configureMessageHandlerMethodFactory(
        DefaultMessageHandlerMethodFactory messageHandlerMethodFactory,
        LocalValidatorFactoryBean validatorFactoryBean) {
    messageHandlerMethodFactory.setValidator(validatorFactoryBean);
    return messageHandlerMethodFactory;
}
This is simple in spring-kafka: implement KafkaListenerConfigurer and set the LocalValidatorFactoryBean on the KafkaListenerEndpointRegistrar:
public class KafkaConfiguration implements KafkaListenerConfigurer {

    @Override
    public void configureKafkaListeners(KafkaListenerEndpointRegistrar registrar) {
        registrar.setValidator(validatorFactoryBean);
    }
    .....
This is not supported in the functional model at the moment. Even for a non-functional scenario, this is non-trivial for types like KStream. The KafkaListenerConfigurer you mentioned above is for regular Kafka support with a message channel binder. Your best options with the Kafka Streams binder are either doing custom validation in the function itself before continuing with the processing, or introducing a schema registry and performing schema validation before passing the record to the function.
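A sketch (not from the answer) of the "validate in the function itself" option: it assumes a javax.validation.Validator bean (for example a LocalValidatorFactoryBean) can be injected into the @Bean method of a configuration class, and that invalid records should simply be skipped; handle(v) stands in for the real business logic.

@Bean
public Consumer<KStream<String, Pojo>> process(Validator validator) {
    return messages -> messages.foreach((k, v) -> {
        Set<ConstraintViolation<Pojo>> violations = validator.validate(v);
        if (!violations.isEmpty()) {
            // skip (or log / route to a dead-letter topic) instead of processing
            return;
        }
        handle(v); // hypothetical business-logic method
    });
}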
You can follow the recommendation to create a bean that respects Java's functional interface contract, that is, it has only one public method, for example:
@Validated
@Component
public class Processor implements Consumer<KStream<String, Pojo>> {

    @Override
    public void accept(final @Valid @NotNull KStream<String, Pojo> stream) {
        stream.foreach((k, v) -> process(v));
    }

    private void process(final Pojo v) {
    }
}
But that generates an exception at runtime:
javax.validation.ConstraintDeclarationException: HV000151: A method
overriding another method must not reset the parameter constraint
configuration
It is not possible to override the parameter constraints of the Consumer functional interface's accept method, so just remove the interface and leave the component like this:
@Validated
@Component
public class Processor {

    public void accept(final @Valid @NotNull KStream<String, Pojo> stream) {
        stream.foreach((k, v) -> process(v));
    }

    private void process(final Pojo v) {
    }
}
The problem is that Spring Cloud Function will not recognize the bean because it does not implement one of the functional interfaces.
The workaround I ended up with was:
@RequiredArgsConstructor
public abstract class ValidatedEventListener<T> implements Consumer<T> {

    private final Validator validator;

    @Override
    public void accept(final T t) {
        validate(t);
        listen(t);
    }

    public abstract void listen(final T t);

    public void validate(final Object event) {
        var violations = validator.validate(event);
        if (!violations.isEmpty()) throw new ConstraintViolationException(violations);
    }
}
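For completeness, a hypothetical concrete listener built on that abstract class could look like this (assuming the plain message-channel binder delivers the Pojo itself rather than a KStream; the bean name "process" is an assumption):

// Hypothetical usage of the ValidatedEventListener workaround above.
@Component("process")
public class PojoEventListener extends ValidatedEventListener<Pojo> {

    public PojoEventListener(final Validator validator) {
        super(validator);
    }

    @Override
    public void listen(final Pojo pojo) {
        // business logic runs only on events that passed validation
    }
}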

How to test a kafka consumer against a real kafka broker running on a server?

I have difficulty understanding some Kafka concepts in Java Spring Boot. I’d like to test a consumer against a real Kafka broker running on a server, which has some producers that write / have already written data to various topics. I would like to establish a connection with the server, consume the data, and verify or process its content in a test.
The vast majority of examples (actually all I have seen so far) on the internet refer to embedded Kafka, EmbeddedKafkaBroker, and show both a producer and a consumer implemented on one machine, locally. I haven't found any example that explains how to make a connection with a remote Kafka server and read data from a particular topic.
I've written some code and I've printed the broker address with:
System.out.println(embeddedKafkaBroker.getBrokerAddress(0));
What I got is 127.0.0.1:9092, which means that it is local, so the connection with the remote server has not been established.
On the other hand, when I run the SpringBootApplication I get the payload from the remote broker.
Receiver:
@Component
public class Receiver {

    private static final String TOPIC_NAME = "X";
    private static final Logger LOGGER = LoggerFactory.getLogger(Receiver.class);

    private CountDownLatch latch = new CountDownLatch(1);

    public CountDownLatch getLatch() {
        return latch;
    }

    @KafkaListener(topics = TOPIC_NAME)
    public void receive(final byte[] payload) {
        LOGGER.info("received the following payload: '{}'", payload);
        latch.countDown();
    }
}
Config:
@EnableKafka
@Configuration
public class ByteReceiverConfig {

    @Autowired
    EmbeddedKafkaBroker kafkaEmbeded;

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Value("${spring.kafka.consumer.group-id}")
    private String groupIdConfig;

    @Bean
    public KafkaListenerContainerFactory<?> kafkaListenerContainerFactory() {
        final ConcurrentKafkaListenerContainerFactory<Object, Object> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }

    @Bean
    ConsumerFactory<Object, Object> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerProperties());
    }

    @Bean
    Map<String, Object> consumerProperties() {
        final Map<String, Object> properties =
                KafkaTestUtils.consumerProps("junit-test", "true", this.kafkaEmbeded);
        properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class);
        properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class);
        properties.put(ConsumerConfig.GROUP_ID_CONFIG, groupIdConfig);
        return properties;
    }
}
Test:
@EnableAutoConfiguration
@EnableKafka
@SpringBootTest(classes = {ByteReceiverConfig.class, Receiver.class})
@EmbeddedKafka
@ContextConfiguration(classes = ByteReceiverConfig.class)
@TestPropertySource(properties = { "spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}",
        "spring.kafka.consumer.group-id=EmbeddedKafkaTest"})
public class KafkaTest {

    @Autowired
    private KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;

    @Autowired
    EmbeddedKafkaBroker embeddedKafkaBroker;

    @Autowired
    Receiver receiver;

    @BeforeEach
    void waitForAssignment() {
        for (MessageListenerContainer messageListenerContainer : kafkaListenerEndpointRegistry.getListenerContainers()) {
            System.out.println(messageListenerContainer.getAssignedPartitions().isEmpty());
            System.out.println(messageListenerContainer.toString());
            System.out.println(embeddedKafkaBroker.getTopics().size());
            System.out.println(embeddedKafkaBroker.getPartitionsPerTopic());
            System.out.println(embeddedKafkaBroker.getBrokerAddress(0));
            System.out.println(embeddedKafkaBroker.getBrokersAsString());
            ContainerTestUtils.waitForAssignment(messageListenerContainer,
                    embeddedKafkaBroker.getPartitionsPerTopic());
        }
    }

    @Test
    public void testReceive() {
    }
}
I would like somebody to shed some light on the following issues:
1. Can an instance of the class EmbeddedKafkaBroker be used to test data that comes from a remote broker, or is it only used for local tests, in which I would produce (i.e. send) data to a topic that I created and consume the data myself?
2. Is it possible to write a test class for a real Kafka server? For instance, to verify whether a connection has been established or whether data has been read from a specific topic. What annotations, configurations, and classes would be needed in such a case?
3. If I only want to consume data, do I have to provide the producer configuration in a config file (it would be strange, but all the examples I have encountered so far did it)?
4. Do you know any resources (books, websites, etc.) that show real examples of using Kafka, i.e. with a remote Kafka server, with a producer or a consumer only?
You don't need an embedded broker at all if you want to talk to an external broker only.
Yes, just set the bootstrap servers property appropriately.
No, you don't need producer configuration.
EDIT
@SpringBootApplication
public class So56044105Application {

    public static void main(String[] args) {
        SpringApplication.run(So56044105Application.class, args);
    }

    @Bean
    public NewTopic topic() {
        return new NewTopic("so56044105", 1, (short) 1);
    }
}
spring.kafka.bootstrap-servers=10.0.0.8:9092
spring.kafka.consumer.enable-auto-commit=false
@RunWith(SpringRunner.class)
@SpringBootTest(classes = { So56044105Application.class, So56044105ApplicationTests.Config.class })
public class So56044105ApplicationTests {

    @Autowired
    public Config config;

    @Test
    public void test() throws InterruptedException {
        assertThat(config.latch.await(10, TimeUnit.SECONDS)).isTrue();
        assertThat(config.received.get(0)).isEqualTo("foo");
    }

    @Configuration
    public static class Config implements ConsumerSeekAware {

        List<String> received = new ArrayList<>();

        CountDownLatch latch = new CountDownLatch(3);

        @KafkaListener(id = "so56044105", topics = "so56044105")
        public void listen(String in) {
            System.out.println(in);
            this.received.add(in);
            this.latch.countDown();
        }

        @Override
        public void registerSeekCallback(ConsumerSeekCallback callback) {
        }

        @Override
        public void onPartitionsAssigned(Map<TopicPartition, Long> assignments, ConsumerSeekCallback callback) {
            System.out.println("Seeking to beginning");
            assignments.keySet().forEach(tp -> callback.seekToBeginning(tp.topic(), tp.partition()));
        }

        @Override
        public void onIdleContainer(Map<TopicPartition, Long> assignments, ConsumerSeekCallback callback) {
        }
    }
}
There are some examples in this repository for bootstrapping real Kafka producers and consumers across a variety of configurations — plaintext, SSL, with and without authentication, etc.
Note: the repo above contains examples for the Effective Kafka book, which I am the author of. However, they can be used freely without the book and hopefully they make just as much sense on their own.
More to the point, here are a pair of examples for a basic producer and a consumer.
/** A sample Kafka producer. */
import static java.lang.System.*;
import java.util.*;
import org.apache.kafka.clients.producer.*;
import org.apache.kafka.common.serialization.*;
public final class BasicProducerSample {
public static void main(String[] args) throws InterruptedException {
final var topic = "getting-started";
final Map<String, Object> config =
Map.of(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",
ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName(),
ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName(),
ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);
try (var producer = new KafkaProducer<String, String>(config)) {
while (true) {
final var key = "myKey";
final var value = new Date().toString();
out.format("Publishing record with value %s%n",
value);
final Callback callback = (metadata, exception) -> {
out.format("Published with metadata: %s, error: %s%n",
metadata, exception);
};
// publish the record, handling the metadata in the callback
producer.send(new ProducerRecord<>(topic, key, value), callback);
// wait a second before publishing another
Thread.sleep(1000);
}
}
}
}
/** A sample Kafka consumer. */
import static java.lang.System.*;
import java.time.*;
import java.util.*;
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.common.serialization.*;
public final class BasicConsumerSample {
public static void main(String[] args) {
final var topic = "getting-started";
final Map<String, Object> config =
Map.of(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092",
ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName(),
ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName(),
ConsumerConfig.GROUP_ID_CONFIG, "basic-consumer-sample",
ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest",
ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
try (var consumer = new KafkaConsumer<String, String>(config)) {
consumer.subscribe(Set.of(topic));
while (true) {
final var records = consumer.poll(Duration.ofMillis(100));
for (var record : records) {
out.format("Got record with value %s%n", record.value());
}
consumer.commitAsync();
}
}
}
}
Now, these are obviously not unit tests, but with very little rework they could be turned into tests. The next step would be to remove Thread.sleep() and add assertions. Note that, since Kafka is inherently asynchronous, naively asserting on a published message in a consumer immediately after publishing will fail. For a robust, repeatable test, you may want to use something like Timesert.
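Purely as an illustration of that retry-until-asserted idea (using Awaitility, a library similar in spirit, rather than Timesert; the topic, the received list, and the test wiring are all assumptions, not taken from the book's examples):

import static org.assertj.core.api.Assertions.assertThat;
import static org.awaitility.Awaitility.await;

import java.time.Duration;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

import org.junit.Test;

public class ProducerConsumerIT {

    private final List<String> received = new CopyOnWriteArrayList<>();

    @Test
    public void publishedRecordEventuallyConsumed() {
        // ... start a consumer that appends record values to 'received',
        // then publish a record with value "foo" ...
        // Retry the assertion until the consumer catches up or 10 s elapse.
        await().atMost(Duration.ofSeconds(10))
               .untilAsserted(() -> assertThat(received).contains("foo"));
    }
}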

How do I implement the business logic of an MVC in a separate class from the Servlet?

I have experimented with putting the class in the servlet controller, but I have trouble with the constructor and setting method access. I read that it improves efficiency to have the business logic separate. I even tried putting it in the JavaBean, but I don't yet know how to send parameters from the controller to it. I still have much to learn; I am just working on a project.
You can use EJB to separate business logic from the presentation tier, i.e. Servlets and JSP, in the Java EE platform. If your project doesn't have much business-logic code, then simply use plain Java POJO classes. This example gives a very rough idea; you can also use web frameworks that have a built-in MVC design.
Controller:
Use Servlets to control navigation or perform other tasks against HTTP requests.
#WebServlet("/LoginServlet")
public class LoginServlet extends HttpServlet {
private static final long serialVersionUID = 1L;
public LoginServlet() {
super();
}
protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
LoginManager loginManager=new LoginManager();
if(loginManager.isValidUser("getUserID from request Params","password from request params")){
//initialize user session and redirect to dashboard
//response.sendRedirect("/userhome.jsp");
}else{
//display failure messages. etc...
//response.sendRedirect("/login.jsp");
}
}
}
Model:
A POJO which contains a set of methods for login-related operations.
public class LoginManager {

    private Connection con;

    public LoginManager() {
    }

    private void initConnection() {
        // register driver class and create a new connection
        // you can create a separate DBUtils class to get new connections
        // to prevent boilerplate code.
        // make new connection to database
        // con = ..
    }

    private void closeConnection() throws SQLException {
        con.close();
    }

    public boolean isValidUser(String user, String password) throws SQLException {
        initConnection();
        try {
            PreparedStatement pstm = con.prepareStatement("select 1 from users where userID = ? and password = ?");
            // set userID and password params
            ResultSet rs = pstm.executeQuery();
            if (rs.next()) {
                if (checkpassword.....)
                    return true;
            }
            return false;
        } finally {
            // close the connection whether or not the user was found
            closeConnection();
        }
    }
}
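As hinted at in the initConnection() comments, a tiny helper like the following can keep the connection boilerplate out of LoginManager. This is a hypothetical class; the JDBC URL and credentials are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

// Hypothetical DBUtils helper; URL and credentials are placeholders.
public final class DBUtils {

    private static final String URL = "jdbc:mysql://localhost:3306/mydb";
    private static final String USER = "dbuser";
    private static final String PASSWORD = "dbpass";

    private DBUtils() {
    }

    public static Connection getConnection() throws SQLException {
        return DriverManager.getConnection(URL, USER, PASSWORD);
    }
}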
View:
Pages like login.jsp and userhome.jsp are the views.

Spring MVC - PropertyEditor not called during ModelAttribute type conversion

Using Spring 3.2.3, I'm trying to implement a simple CRUD controller that handles REST-ful URLs. It relies on a PropertyEditor to convert a path variable to a BusinessService entity by loading it from an application service. Code is as follows:
@Controller
public class BusinessServiceController {

    @Autowired
    private BusinessServiceService businessSvcService;

    public BusinessServiceController() {
    }

    @InitBinder
    public void initBinder(final WebDataBinder binder) {
        binder.registerCustomEditor(BusinessService.class, new BusinessServicePropertyEditor(businessSvcService));
    }

    @RequestMapping(value = "/ui/account/business-services/{businessSvc}", method = RequestMethod.POST, consumes = MediaType.APPLICATION_FORM_URLENCODED_VALUE)
    public ModelAndView update(@ModelAttribute("businessSvc") @Valid final BusinessService businessSvc, final BindingResult result,
            final RedirectAttributes redirectAttribs) throws UnknownBusinessServiceException {
        ModelAndView mav;
        if (result.hasErrors()) {
            mav = new ModelAndView("/business-service/edit");
        }
        else {
            businessSvcService.updateBusinessService(XSecurity.principal().getId(), businessSvc);
            mav = new ModelAndView("redirect:/ui/account/business-services");
            redirectAttribs.addFlashAttribute("message", Message.info("businessService.updated", businessSvc.getTitle()));
        }
        return mav;
    }
}
public class BusinessServicePropertyEditor extends PropertyEditorSupport {

    private final BusinessServiceService businessSvcService;

    public BusinessServicePropertyEditor(final BusinessServiceService businessSvcService) {
        this.businessSvcService = businessSvcService;
    }

    @Override
    public String getAsText() {
        final BusinessService svc = (BusinessService) getValue();
        return Long.toString(svc.getId());
    }

    @Override
    public void setAsText(final String text) {
        final BusinessService svc = businessSvcService.getBusinessService(Long.parseLong(text));
        setValue(svc);
    }
}
According to SPR-7608, starting from Spring 3.2, @ModelAttribute method argument resolution checks if a path variable by the same name exists (it does here), in which case it tries to convert that path variable's value to the target parameter type through registered Converters and PropertyEditors. This is not what I'm experiencing. When I inspect what ServletModelAttributeMethodProcessor does, it clearly uses the request DataBinder's ConversionService to perform type conversion, which does not consider registered PropertyEditors, and hence BusinessServicePropertyEditor#setAsText is never called.
Is this a configuration problem or an actual bug?
Thanks for your help!
Spring's ConversionService and Converters are a replacement for the standard JavaBeans PropertyEditors.
You need to implement a Converter instead of a PropertyEditor if this feature is based purely on the conversion service.
To register your custom converters in the WebDataBinder you might use a ConfigurableWebBindingInitializer or an @InitBinder method.
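For illustration, a Converter counterpart to the question's PropertyEditor could look roughly like this (a sketch of the suggested approach, not code from the question or answer; how you register it depends on your setup, e.g. overriding addFormatters in a WebMvcConfigurer or using a ConfigurableWebBindingInitializer):

import org.springframework.core.convert.converter.Converter;

// Sketch: converts the {businessSvc} path variable into the entity, performing
// the same lookup the PropertyEditor did.
public class BusinessServiceConverter implements Converter<String, BusinessService> {

    private final BusinessServiceService businessSvcService;

    public BusinessServiceConverter(final BusinessServiceService businessSvcService) {
        this.businessSvcService = businessSvcService;
    }

    @Override
    public BusinessService convert(final String source) {
        return businessSvcService.getBusinessService(Long.parseLong(source));
    }
}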

Spring MVC test case

I am new to Spring MVC. I have written a web service using Spring MVC and RESTEasy. My controller is working fine; now I need to write a test case. I have tried writing one but never succeeded, and I am also getting a problem with autowiring.
@Controller
@Path("/searchapi")
public class SearchAPIController implements ISearchAPIController {

    @Autowired
    private ISearchAPIService srchapiservice;

    @GET
    @Path("/{domain}/{group}/search")
    @Produces({"application/xml", "application/json"})
    public Collections getSolrData(
            @PathParam("domain") final String domain,
            @PathParam("group") final String group,
            @Context final UriInfo uriinfo) throws Exception {
        System.out.println("LANDED IN get****************");
        return srchapiservice.getData(domain, group, uriinfo);
    }
}
Can anyone give me sample code for a test case in Spring MVC?
"Spring-MVC" Test case could seem like this using mock objects, for example we want to test my MyControllerToBeTest:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("/spring.xml")
public class MyControllerTest {

    private MockHttpServletRequest request;
    private MockHttpServletResponse response;
    private MyControllerToBeTested controller;
    private AnnotationMethodHandlerAdapter adapter;

    @Autowired
    private ApplicationContext applicationContext;

    @Before
    public void setUp() {
        request = new MockHttpServletRequest();
        response = new MockHttpServletResponse();
        response.setOutputStreamAccessAllowed(true);
        controller = new MyControllerToBeTested();
        adapter = new AnnotationMethodHandlerAdapter();
    }

    @Test
    public void findRelatedVideosTest() throws Exception {
        request.setRequestURI("/mypath");
        request.setMethod("GET");
        request.addParameter("myParam", "myValue");
        adapter.handle(request, response, controller);
        System.out.println(response.getContentAsString());
    }
}
But I don't have any experience with REST resource testing, in your case RESTEasy.
If you want to test the full service inside the container you can have a look at the REST Assured framework for Java. It makes it very easy to test and validate HTTP/REST-based services.
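For example, a minimal REST Assured check against the /searchapi endpoint above might look like the sketch below; the base URI, path-parameter values, and the "results" JSON field are assumptions, not taken from the question.

import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.notNullValue;

import org.junit.Test;

// Sketch of an in-container test hitting the deployed service over HTTP.
public class SearchApiRestAssuredTest {

    @Test
    public void searchEndpointRespondsWithJson() {
        given()
            .baseUri("http://localhost:8080")
            .accept("application/json")
        .when()
            .get("/searchapi/{domain}/{group}/search", "mydomain", "mygroup")
        .then()
            .statusCode(200)
            .body("results", notNullValue());
    }
}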
