How to use hibernate-validator to validate interface parameters - bean-validation

I have the following code to implement interface input-parameter validation, and now I want to use hibernate-validator to do this:
public class Order
{
    private String orderNo;
    private String orderId;
    private String status;
    private String startTime;
    private String endTime;
    //getter and setter...
}
public class OrderService
{
    public Object search(Order order) throws Exception
    {
        String message = "";
        if (order.getOrderId().isEmpty() && order.getOrderNo().isEmpty() && order.getStatus().isEmpty())
        {
            if (order.getStartTime().isEmpty() && order.getEndTime().isEmpty())
                message = "xxx";
        }
        if (!message.isEmpty())
            throw new Exception(message);
        Object result = null;
        // splice sql according to the attribute of order and get the result
        // result = sql query result
        return result;
    }
}
I tried to use Hibernate Validator's groups to achieve this, but if there are more parameters I have to write a lot of groups, which seems clumsy. I have more than 100 interfaces, and more will be added later. Would class-level constraints be a good choice?
Below is my attempt using Hibernate Validator's groups:
public class Order
{
    @Empty(groups = One.class)
    @NotEmpty(groups = Two.class)
    private String orderNo;

    @Empty(groups = One.class)
    @NotEmpty(groups = Three.class)
    private String orderId;

    @Empty(groups = One.class)
    @NotEmpty(groups = Four.class)
    private String status;

    @NotEmpty(groups = One.class)
    private String startTime;

    @NotEmpty(groups = One.class)
    private String endTime;
}
public class BeanValidatorUtils
{
    static Validator validator;

    static
    {
        HibernateValidatorConfiguration configuration = Validation.byProvider(HibernateValidator.class).configure();
        ValidatorFactory factory = configuration.failFast(true).buildValidatorFactory();
        validator = factory.getValidator();
    }

    public static <T> void validation(T beanParam) throws AppException
    {
        if (!containsGroup(beanParam, One.class))
            return;
        Set<ConstraintViolation<T>> validate = validator.validate(beanParam, One.class);
        if (validate.isEmpty())
            return;
        String firstViolationMessage = validate.iterator().next().getMessage();
        if (!validate.isEmpty() && containsGroup(beanParam, Two.class))
        {
            validate = validator.validate(beanParam, Two.class);
        }
        if (!validate.isEmpty() && containsGroup(beanParam, Three.class))
        {
            validate = validator.validate(beanParam, Three.class);
        }
        if (!validate.isEmpty())
            throw new AppException(firstViolationMessage);
    }

    private static boolean containsGroup(Object bean, Class<?> groupClazz)
    {
        // ...
    }
}
Is there any other way to use Hibernate-validator to verify the Order in the search method?

As you are trying to make a validation decision based on the state of multiple properties of the Order, you might want to explore these three options:
Class-level constraint
This would mean that you create your own constraint annotation (let's say @ValidOrder) and a corresponding ValidOrderValidator:
@Target({ METHOD, FIELD, ANNOTATION_TYPE, CONSTRUCTOR, PARAMETER, TYPE_USE })
@Retention(RUNTIME)
@Documented
@Constraint(validatedBy = { ValidOrderValidator.class })
@interface ValidOrder {
    String message() default "{message.key}";

    Class<?>[] groups() default { };

    Class<? extends Payload>[] payload() default { };
}
public class ValidOrderValidator implements ConstraintValidator<ValidOrder, Order> {

    @Override
    public boolean isValid(Order order, ConstraintValidatorContext constraintValidatorContext) {
        // null values are valid
        if ( order == null ) {
            return true;
        }
        if (order.getOrderId().isEmpty() && order.getOrderNo().isEmpty() && order.getStatus().isEmpty()) {
            if ( order.getStartTime().isEmpty() && order.getEndTime().isEmpty() ) {
                return false;
            }
        }
        return true;
    }
}
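With the class-level constraint in place, the Order carries its own rule and the service only has to call the validator. Here is a minimal sketch of how search could use it (plain javax.validation bootstrapping, with the exception handling kept as simple as in the question):
import java.util.Set;
import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;

@ValidOrder
public class Order {
    // fields, getters and setters as before
}

public class OrderService {

    private static final Validator VALIDATOR =
            Validation.buildDefaultValidatorFactory().getValidator();

    public Object search(Order order) throws Exception {
        // the class-level constraint sees all properties of the bean at once
        Set<ConstraintViolation<Order>> violations = VALIDATOR.validate(order);
        if (!violations.isEmpty()) {
            throw new Exception(violations.iterator().next().getMessage());
        }
        // build the query from the order attributes and return the result
        return null;
    }
}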
You can also check this post for more detailed info on how to add new constraints using ServiceLoader.
@ScriptAssert constraint
If your validation logic is relatively simple and you either already have a dependency on a scripting engine or are willing to add one, you can consider using the @ScriptAssert constraint. This is similar to the previous option, but you don't need to create an annotation and a validator implementation; you just put the script logic into the constraint:
@ScriptAssert(lang = "groovy", script = "your validation script logic")
class Order {
    //...
}
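For the rule from the question, the script could look something like this (just a sketch: it assumes a Groovy engine on the classpath, uses the default _this alias for the validated bean, and assumes the fields are never null, as in the original check):
@ScriptAssert(lang = "groovy",
        script = "!(_this.orderId.isEmpty() && _this.orderNo.isEmpty() && _this.status.isEmpty() "
                + "&& _this.startTime.isEmpty() && _this.endTime.isEmpty())",
        message = "at least one search criterion must be provided")
class Order {
    //...
}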
@AssertTrue constraint
Last but not least, one of the easiest ways to address such validation is to use the @AssertTrue constraint on a getter with the validation logic inside the Order class:
class Order {
    //...

    @AssertTrue
    public boolean isValidOrder() {
        // your validation logic
    }
}
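Filled in with the rule from the question, that getter could look roughly like this (a sketch; Bean Validation reports the violation against a property named validOrder, and the null handling is kept as naive as in the original code):
class Order {
    //...

    @AssertTrue(message = "provide an orderId, orderNo or status, or a start/end time")
    public boolean isValidOrder() {
        // valid as soon as at least one search criterion is filled in
        boolean hasIdentifier = !orderId.isEmpty() || !orderNo.isEmpty() || !status.isEmpty();
        boolean hasTimeRange = !startTime.isEmpty() || !endTime.isEmpty();
        return hasIdentifier || hasTimeRange;
    }
}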
Using any of these three approaches, you'd be able to make a validation decision based on multiple properties of the Order class.
As for validation groups - you can leverage groups if you need to pass the same Order object into multiple different methods/interfaces where a different set of validation rules needs to be applied in each of them. Say, in one case you create an order and half of the fields may be null, but in the other you update it and everything must be present.
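To make that concrete, here is a small sketch of the same bean carrying two rule sets, where the caller picks one per use case (the group interfaces are plain marker interfaces, an existing Validator instance is assumed at the call sites, and @NotEmpty is javax.validation.constraints.NotEmpty in Bean Validation 2.0, or the org.hibernate.validator equivalent in older versions):
// marker interfaces used only to select which rules apply
interface OnCreate { }
interface OnUpdate { }

class Order {

    // optional when creating, mandatory when updating
    @NotEmpty(groups = OnUpdate.class)
    private String orderNo;

    // required in both cases
    @NotEmpty(groups = { OnCreate.class, OnUpdate.class })
    private String status;

    // remaining fields, getters and setters...
}

// at the call sites
Set<ConstraintViolation<Order>> createViolations = validator.validate(order, OnCreate.class);
Set<ConstraintViolation<Order>> updateViolations = validator.validate(order, OnUpdate.class);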

Related

Current datetime shouldn't pass the validation using @Past annotation

I need @Past to report an error when the field is set to now. I realize that the 'now' value on the field and the 'now' value used when the validator compares would differ slightly, hence the need to set the tolerance in Hibernate Validator.
The problem is that I cannot get this to work. Here is the JUnit test:
@Test
public void testHibernateValidator_withPast_withTodayDate() {
    // populates with 'now'
    MyFormWithPast form = new MyFormWithPast();
    form.setDt(OffsetDateTime.now(Clock.systemUTC()));

    ValidatorFactory factory = Validation.byProvider(HibernateValidator.class)
            .configure()
            .clockProvider(() -> Clock.systemUTC())
            // adds tolerance so that when comparing, the form dt and 'now' are considered equal,
            // therefore dt is not a past datetime
            .temporalValidationTolerance(Duration.ofMinutes(1))
            .buildValidatorFactory();
    Validator validator = factory.getValidator();

    Set<ConstraintViolation<MyFormWithPast>> errors = validator.validate(form);
    // needs to fail, since 'now' shouldn't be considered 'past'
    assertFalse("now shouldn't be considered as Past", errors.isEmpty());
}
public static class MyFormWithPast {

    @Past
    private OffsetDateTime dt;

    public void setDt(OffsetDateTime dt) {
        this.dt = dt;
    }

    public OffsetDateTime getDt() {
        return dt;
    }
}
I expect the validation to fail when I put 'now' in the field, as 'now' shouldn't be considered 'past'. What did I miss?
The temporal validation tolerance was designed to be more lenient, not stricter. You want it to be stricter.
I think you will need your own constraints to deal with what you want to do.
Just want to share my current solution: a custom constraint that adds a forward tolerance of 1 minute, so that an input of 'now' is not considered 'past'.
The annotation:
/**
 * Validates that the date is in the past, with a forward tolerance of 1 minute,
 * to offset the time it takes to create the 'now' instance to compare against.
 * The use case is when the user selects 'today' in the UI: we don't want it to be considered 'past'.
 * https://stackoverflow.com/questions/60341963/current-datetime-shouldnt-pass-the-validation-using-past-annotation
 * The annotation is applicable to {@link OffsetDateTime}.
 */
@Target({ METHOD, FIELD, ANNOTATION_TYPE, CONSTRUCTOR, PARAMETER })
@Retention(RUNTIME)
@Documented
@Constraint(validatedBy = StrictPastValidator.class)
public @interface StrictPast {

    public static final String MESSAGE = "{constraints.StrictPast.message}";

    /**
     * @return The error message template.
     */
    String message() default MESSAGE;

    /**
     * @return The groups the constraint belongs to.
     */
    Class<?>[] groups() default { };

    /**
     * @return The payload associated to the constraint.
     */
    Class<? extends Payload>[] payload() default {};
}
The validator:
public class StrictPastValidator implements ConstraintValidator<StrictPast, Object> {

    @Override
    public void initialize(StrictPast annotation) {
    }

    @Override
    public boolean isValid(Object input, ConstraintValidatorContext ignored) {
        if (input == null) {
            return true;
        } else if (input instanceof OffsetDateTime) {
            return isValidOffsetDateTime((OffsetDateTime) input);
        }
        throw new IllegalStateException("StrictPastValidator is not applicable to the field type " + input.getClass().getName());
    }

    private boolean isValidOffsetDateTime(OffsetDateTime input) {
        OffsetDateTime plusSecondsDt = input.plusSeconds(Duration.ofMinutes(1).getSeconds());
        return plusSecondsDt.isBefore(OffsetDateTime.now(Clock.systemUTC()));
    }
}
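Usage is then just a matter of swapping the standard annotation for the custom one on the form from the question:
public static class MyFormWithPast {

    @StrictPast
    private OffsetDateTime dt;

    // getter and setter as before
}
Note that because the default message is the key {constraints.StrictPast.message}, it also needs an entry in ValidationMessages.properties (or you can default to a literal message instead). The clockProvider and temporalValidationTolerance settings from the test no longer drive the comparison, since this validator builds its own 'now' from Clock.systemUTC().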

DynamoDBMapper - DynamoDBDocument and local secondary indexes

I am using DynamoDB SDK version 1.11.185. I have a Client entity mapped to the Clients DynamoDB table through DynamoDBMapper annotations. One of the attributes contains nested values (see the code examples below). I want to add a local secondary index on the zone attribute from the Options class. When I try to save an object, I get a NullPointerException from
com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperTableModel
at line 415. It looks like a bug in the library.
P.S. If I create the index on the clientId attribute of the Client entity, everything works fine.
@DynamoDBTable(tableName = "Clients")
public class Client {

    private String id;
    private String clientId;
    private Date created;
    private Options options;

    public Client() {
    }

    public Client(String id, String clientId, Options options) {
        this.id = id;
        this.clientId = clientId;
        this.options = options;
        this.created = new Date();
    }

    @DynamoDBHashKey
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    @DynamoDBAttribute
    public String getClientId() { return clientId; }
    public void setClientId(String clientId) { this.clientId = clientId; }

    @DynamoDBAttribute
    public Options getOptions() { return options; }
    public void setOptions(Options options) { this.options = options; }

    @DynamoDBRangeKey
    public Date getCreated() { return created; }
    public void setCreated(Date created) { this.created = created; }
}
@DynamoDBDocument
public class Options {

    private String zone;

    public Options() {
    }

    public Options(String zone) {
        this.zone = zone;
    }

    @DynamoDBIndexRangeKey(localSecondaryIndexName = "zone-index")
    public String getZone() { return zone; }
    public void setZone(String zone) { this.zone = zone; }
}
EDIT: Correct answer by @Raniz below; see also Indexing on nested field.
It can be done using JSON attributes though: see DynamoDB create index on map or list type.
You can't use nested attributes in the key schema for an index.
I assume you've created the index with options.zone in the schema which means that DynamoDB is expecting a top-level attribute with that exact name - i.e. an attribute named options.zone and not an attribute named zone nested under the options attribute.
Excerpt from here:
The key schema for the index. Every attribute in the index key schema must be a top-level attribute of type String, Number, or Binary. Other data types, including documents and sets, are not allowed. Other requirements for the key schema depend on the type of index:
For a global secondary index, the partition key can be any scalar attribute of the base table. A sort key is optional, and it too can be any scalar attribute of the base table.
For a local secondary index, the partition key must be the same as the base table's partition key, and the sort key must be a non-key base table attribute.
To use zone in your index schema you'll need to either move or duplicate it so it's available on the top level. The easiest way of accomplishing this would probably be to add a getter to Client that returns options.zone:
@DynamoDBIndexRangeKey(localSecondaryIndexName = "zone-index")
public String getZone() {
    if (options != null) {
        return options.getZone();
    }
    return null;
}
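Once zone is exposed as a top-level attribute like this, the local secondary index can be queried through the mapper. A rough sketch (assuming a configured DynamoDBMapper instance; DynamoDBQueryExpression comes from the datamodeling package and Condition/ComparisonOperator/AttributeValue from com.amazonaws.services.dynamodbv2.model; the id and zone values are placeholders):
public List<Client> findByZone(DynamoDBMapper mapper, String clientId, String zone) {
    // an LSI always shares the table's partition key, so the hash key object is a Client with its id set
    Client hashKey = new Client();
    hashKey.setId(clientId);

    DynamoDBQueryExpression<Client> query = new DynamoDBQueryExpression<Client>()
            .withIndexName("zone-index")
            .withHashKeyValues(hashKey)
            .withRangeKeyCondition("zone", new Condition()
                    .withComparisonOperator(ComparisonOperator.EQ)
                    .withAttributeValueList(new AttributeValue(zone)))
            .withConsistentRead(true); // consistent reads are allowed on LSIs

    return mapper.query(Client.class, query);
}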

Better way to cache a Model

My simple repository's getAll method:
public List<ListModel> GetAllLists()
{
    using (MySqlConnection connection = new MySqlConnection(this.connectionString))
    {
        return connection.Query<ListModel>("SELECT * FROM projectx.lists").AsList();
    }
}
I'm using this class, which I found here on SO, to handle caching:
public class CacheUtils : ICacheService
{
    public TValue Get<TValue>(string cacheKey, Func<TValue> getItemCallback, double durationInMinutes = 120) where TValue : class
    {
        TValue item = MemoryCache.Default.Get(cacheKey) as TValue;
        if (item == null)
        {
            Debug.WriteLine("Not cached");
            item = getItemCallback();
            MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(durationInMinutes));
        }
        else
            Debug.WriteLine("Cached!");
        return item;
    }

    public TValue Get<TValue, TId>(string cacheKeyFormat, TId id, Func<TId, TValue> getItemCallback, double durationInMinutes = 120) where TValue : class
    {
        string cacheKey = string.Format(cacheKeyFormat, id);
        TValue item = MemoryCache.Default.Get(cacheKey) as TValue;
        if (item == null)
        {
            item = getItemCallback(id);
            MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(durationInMinutes));
        }
        return item;
    }
}
Home controller:
public ActionResult Index()
{
    ListRepository listRep = new ListRepository();
    CacheUtils cache = new CacheUtils();
    return View(cache.Get("lists", listRep.GetAllLists));
}
Question: is there a better way of handling the cache than calling the helper from the controller? Ideally it should be inside the repository method, but do I need to repeat the check for existing cached data in every single repository method? E.g.:
public List<ListModel> GetAllLists()
{
    var lists = Cache["lists"];
    if (lists == null)
    {
        using (MySqlConnection connection = new MySqlConnection(this.connectionString))
        {
            lists = connection.Query<ListModel>("SELECT * FROM projectx.lists").AsList();
        }
        Cache["lists"] = lists;
    }
    return (List<ListModel>)lists;
}
Use a decorator pattern and don't pollute the business or UI layers with caching logic.
Wire it up with something like Ninject (or poor man's DI if you don't want to add a DI container); I'd recommend marking it as single instance.
Benefits include:
- Adding an invalidating method like void Save(ListModel) makes it easy to invalidate the cache.
- Your top layer and bottom layer know nothing about the fact they have been cached.
- You can decorate again to add logging, profiling, etc.
- You can control the caching life cycle.
- You don't pollute the controller level with caching logic.
- It's easy to remove.
So something like the below would work. For how to add decorators in Ninject see https://stackoverflow.com/a/8910599/1073280
public class MyHomeController
{
    private readonly IListCrud _listcrud;

    public MyHomeController(IListCrud listcrud)
    {
        _listcrud = listcrud;
    }

    public ActionResult Index()
    {
        return View(_listcrud.GetAllLists());
    }
}

public interface IListCrud
{
    List<ListModel> GetAllLists();
}

public class ListCrud : IListCrud
{
    public List<ListModel> GetAllLists()
    {
        using (MySqlConnection connection = new MySqlConnection(this.connectionString))
        {
            return connection.Query<ListModel>("SELECT * FROM projectx.lists").AsList();
        }
    }
}

public class ListCrudCache : IListCrud
{
    private readonly ICacheService _cache;
    private readonly IListCrud _inner;

    public ListCrudCache(ICacheService cache, IListCrud inner)
    {
        _cache = cache;
        _inner = inner;
    }

    public List<ListModel> GetAllLists()
    {
        return _cache.Get("lists", _inner.GetAllLists);
    }
}
Opinion: it may just be to keep the example small, but be careful using SELECT * with an ORM; if someone renames or removes a column you won't have an easy way to detect the failure in a unit test.
In my opinion it shouldn't be in the repository, as that (to me) smells like a violation of SRP. Caching should be a higher-level service above the repository.
You need to think about what actually needs the benefits of the caching. If the caching is for speeding up the Web API interface, then having it in the controller is by far the best way. If you need caching elsewhere too, consider introducing some middle-layer service classes and putting the caching there, but I would always make it optional in some way.

Type check in JSR-303 custom validator initialize method

I'm attempting to create a class-level JSR-303 validation definition that checks that one property occurs before another in time. Because this validation only makes sense for Calendar properties, I was wondering if it is possible to test the property type in the initialize method.
My annotation definition is:
@Target({TYPE, ANNOTATION_TYPE})
@Retention(RUNTIME)
@Constraint(validatedBy = TemporalSequenceValidator.class)
@Documented
public @interface TemporalSequence {
    String message() default "{uk.co.zodiac2000.vcms.constraints.TemporalSequence}";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};
    String first();
    String second();
}
and the validator implementation:
public class TemporalSequenceValidator implements
        ConstraintValidator<TemporalSequence, Object> {

    private String firstFieldName;
    private String secondFieldName;

    @Override
    public void initialize(final TemporalSequence constraintAnnotation) {
        firstFieldName = constraintAnnotation.first();
        secondFieldName = constraintAnnotation.second();
        // Is it possible to test the type of the firstFieldName and
        // secondFieldName properties here?
    }

    @Override
    public boolean isValid(final Object value, final ConstraintValidatorContext context) {
        // omitted
    }
}
Is this a sensible thing to do? What approach would you suggest I use if it is? And what action should occur if the properties are not of the correct type?
You can't really do the check in initialize() since you can't access the validated object there. Instead you could check the type of the fields of the validated object in isValid() using reflection:
if ( !Calendar.class.isAssignableFrom(
        value.getClass().getField( firstFieldName ).getType() ) ) {
    throw new ValidationException( "Field " + firstFieldName + " is not of type Calendar." );
}
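A fuller isValid() along those lines might look like this (a sketch: it uses getDeclaredField() so private fields are found, and treats missing values as valid so that presence can be enforced by separate constraints):
@Override
public boolean isValid(final Object value, final ConstraintValidatorContext context) {
    if (value == null) {
        return true;
    }
    try {
        Field firstField = value.getClass().getDeclaredField(firstFieldName);
        Field secondField = value.getClass().getDeclaredField(secondFieldName);
        if (!Calendar.class.isAssignableFrom(firstField.getType())
                || !Calendar.class.isAssignableFrom(secondField.getType())) {
            throw new ValidationException("Fields " + firstFieldName + " and "
                    + secondFieldName + " must be of type Calendar.");
        }
        firstField.setAccessible(true);
        secondField.setAccessible(true);
        Calendar first = (Calendar) firstField.get(value);
        Calendar second = (Calendar) secondField.get(value);
        if (first == null || second == null) {
            return true;
        }
        // the "first" value must not occur after the "second" value
        return !first.after(second);
    } catch (NoSuchFieldException | IllegalAccessException e) {
        throw new ValidationException("Cannot read field of " + value.getClass().getName(), e);
    }
}
(Field is java.lang.reflect.Field and ValidationException is javax.validation.ValidationException.)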

Grails bind request parameters to enum

My Grails application has a large number of enums that look like this:
public enum Rating {
    BEST("be"), GOOD("go"), AVERAGE("av"), BAD("ba"), WORST("wo")

    final String id

    private Rating(String id) {
        this.id = id
    }

    public static Rating getEnumFromId(String value) {
        values().find { it.id == value }
    }
}
If I have a command object such as this:
class MyCommand {
Rating rating
}
I would like to (for example) automatically convert a request parameter with value "wo" to Rating.WORST.
The procedure for defining custom converters is described here (in the context of converting Strings to Dates). Although this procedure works fine, I don't want to have to create a class implementing PropertyEditorSupport for each of my enums. Is there a better alternative?
I found a solution I'm pretty happy with.
Step 1: Create an implementation of PropertyEditorSupport to convert text to/from the relevant Enum
public class EnumEditor extends PropertyEditorSupport {

    private Class<? extends Enum<?>> clazz

    public EnumEditor(Class<? extends Enum<?>> clazz) {
        this.clazz = clazz
    }

    public String getAsText() {
        return value?.id
    }

    public void setAsText(String text) {
        value = clazz.getEnumFromId(text)
    }
}
Step 2: Define a class that registers EnumEditor as a converter for the various enum classes. To change the list of enum classes that are bindable by id, just modify BINDABLE_ENUMS
public class CustomPropertyEditorRegistrar implements PropertyEditorRegistrar {

    private static final String REQUIRED_METHOD_NAME = 'getEnumFromId'

    // Add any enums that you want to bind to by ID into this list
    private static final BINDABLE_ENUMS = [Rating, SomeOtherEnum, SomeOtherEnum2]

    public void registerCustomEditors(PropertyEditorRegistry registry) {
        BINDABLE_ENUMS.each { enumClass ->
            registerEnum(registry, enumClass)
        }
    }

    /**
     * Register an enum to be bound by ID from a request parameter
     * @param registry Registry of types eligible for data binding
     * @param enumClass Class of the enum
     */
    private registerEnum(PropertyEditorRegistry registry, Class<? extends Enum<?>> enumClass) {
        boolean hasRequiredMethod = enumClass.metaClass.methods.any { MetaMethod method ->
            method.isStatic() && method.name == REQUIRED_METHOD_NAME && method.parameterTypes.size() == 1
        }
        if (!hasRequiredMethod) {
            throw new MissingMethodException(REQUIRED_METHOD_NAME, enumClass, [String].toArray())
        }
        registry.registerCustomEditor(enumClass, new EnumEditor(enumClass))
    }
}
Step 3: Make Spring aware of the registrar above by defining the following Spring bean in grails-app/conf/spring/resources.groovy:
customPropertyEditorRegistrar(CustomPropertyEditorRegistrar)
The default data binding binds on the enum name, not on a separately defined property of the enum. You can either create your own PropertyEditor as you have mentioned, or do a workaround similar to this:
class MyCommand {
    String ratingId

    Rating getRating() {
        return Rating.getEnumFromId(this.ratingId)
    }

    static constraints = {
        ratingId(validator: { val, obj -> Rating.getEnumFromId(val) != null })
    }
}
