Explore JSON validation #949
Conversation
@gsmet I was thinking about something like the following:

ConstraintMapping newMapping = config.createConstraintMapping();
newMapping
    .type( "FirstJsonType" )
        .property( "prop1" ).constraint( new NotNullDef() ).constraint( new SizeDef().max( 100 ) )
    .type( "SecondJsonType" )
        .property( "prop2" ).constraint( new EmailDef() )
        .property( "prop3" ).valid( "FirstJsonType" );
config.addMapping( newMapping );
Validator validator = config.buildValidatorFactory().getValidator();

The "types" would be defined as unique string names, and in case of cascading, a type string would be passed as an argument.
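For illustration, here is roughly how such a mapping might be exercised against a javax.json payload. The jsonValidator variable and its validateJson( JsonObject, String ) method are purely hypothetical; they only sketch the intent of validating a payload against a named type (assuming the usual javax.json and Bean Validation imports):

// build a payload that should violate the "SecondJsonType" constraints above
JsonObject json = Json.createObjectBuilder()
        .add( "prop2", "not-an-email" )
        .build();

// hypothetical entry point: validate the payload against the named type
Set<ConstraintViolation<JsonObject>> violations = jsonValidator.validateJson( json, "SecondJsonType" );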
How would you do that? I can see how you pass the initial type to the initial […]. Note that we might want a totally different API for JSON, e.g. having a specific […].
I was thinking to pass it while defining constraints: .property( "prop3" ).valid( "FirstJsonType" ); which would mean that we want to cascade the validation of property prop3.
Ah yes, that would work. Wondering if we should make the type a class to support inheritance in the rules. Another option would be to define a property path, something like […]. It's probably less complicated than the […]. As for the architecture, I think I would make it a totally separate module with a […].
Funny, I would have mixed the two ideas: […]
That way you get the best of both worlds. The user can re-use type definitions, and the constraint mapping API stays simple and to the point: just map constraints to properties.
Yes, the only drawback is that you would need to define the whole schema. But indeed, I think the engine will need the types if we don't want to make significant changes to it.
(Sorry, clicked the wrong button...) Just wanted to mention that you would only need to define the part of the schema that is actually validated, which may make a big difference in some cases.
- introduced a new helper in the cdi module to deal with the new PropertyFilter SPI.
I played a bit more with this and here's what I've ended up with so far:

JsonConstraintMapping mapping = /* get the mapping */;
mapping.type( User.class )
    // uses String as the property type by default
    .property( "name" )
        .constraint( new NotNullDef() ).constraint( new LengthDef().min( 3 ) )
    // if it's not a String, we need to pass the type along with the name
    .property( "age", Integer.class )
        .constraint( new MinDef().value( 18 ) )
    // in case of a complex type (inner JSON) we can pass a type and use valid() on it
    .property( "address", Address.class )
        .valid();
// and define constraints for the child JSON as for another type
mapping.type( Address.class )
    .property( "street" )
        .constraint( new NotBlankDef() )
    .property( "building_number", Long.class )
        .constraint( new NotNullDef() ).constraint( new MinDef().value( 1 ) );

We would need to pass the type of each property when defining constraints for it, as otherwise we won't be able to create a correct […].

I have all this metadata definition part ready and started to look into how to integrate it with the rest of the engine, and that's where there are some problems. Throughout the engine code, a bean and its type are bound together, for example in:

public class ValidationContext<T> {
/**
* The root bean of the validation.
*/
private final T rootBean;
/**
* The root bean class of the validation.
*/
private final Class<T> rootBeanClass;
/**
* The metadata of the root bean.
*/
private final BeanMetaData<T> rootBeanMetaData;
}

which might work if we pass beanMetaDataManager.getBeanMetaData( value.getClass() ), as the value would always be JSON in our case, but we need to get the type assigned to that particular property instead.
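To make the problem concrete, a small illustration using the standard javax.json API (the payloads are made up):

JsonObject user = Json.createObjectBuilder().add( "name", "Alex" ).add( "age", 17 ).build();
JsonObject address = Json.createObjectBuilder().add( "street", "Main St." ).build();

// both payloads share the same runtime type (the provider's JsonObject implementation),
// so value.getClass() cannot tell a "User" payload from an "Address" payload;
// only the type declared for the property in the constraint mapping can
assert user.getClass() == address.getClass();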
Yeah, from the beginning, I thought a […]. Not sure about the amount of duplication we would have, though. That being said, […]. About […]: […]
Sorry, a lot of handwaving :). HTH though.
Force-pushed from 020e49c to da03af2.
By the way, minor nitpicking, but I discussed it with my colleague Yoann and we both think having a default type is a bad idea. Let's make the user use a consistent API and define the type explicitly. Yoann is also a bit skeptical about the use of classes to reference the types. I was the one suggesting it initially, as I thought it would be easier to fit into the API, but if we have a separate […].
I did some more investigation around this. I've followed the ideas around PropertyHolder and string names for mappings. So far, here's an updated version of the mapping:

mapping.type( "User.class" )
// uses string as a type by default
.property( "name", String.class )
.constraint( new NotNullDef() ).constraint( new LengthDef().min( 3 ) )
// if it's not a string we need to pass a type with the name for it
.property( "age", Integer.class )
.constraint( new MinDef().value( 18 ) )
// in case of complex type (inner json) we can pass a type and use valid() on it
// note that in case a property is a propertyholder itself the starting method
// name is different
.propertyHolder( "address", "Address.class" )
.valid();
// and define constraint for the child json as for another type
mapping.type( "Address.class" )
.property( "street", String.class )
.constraint( new NotBlankDef() )
.property( "building_number", Long.class )
.constraint( new NotNullDef() ).constraint( new MinDef().value( 1 ) );

The main change from the mapping above is the use of strings instead of classes. This required changing the logic for inner "property holders", as for them we need to pass a mapping name, which is now a string. I've added a different method for such a case:

public interface PropertyHolderTarget {
PropertyConstraintMappingContext propertyHolder(String property, String mappingName);
}

That was the easy part :)

// for json
PropertyHolderConstraintMapping jsonMapping = configuration.createPropertyHolderConstraintMapping(
JavaxJsonPropertyHolderPropertyExtractor.class
);
// for Map
PropertyHolderConstraintMapping mapMapping = configuration.createPropertyHolderConstraintMapping(
MapPropertyHolderPropertyExtractor.class
);

The argument of createPropertyHolderConstraintMapping() is the class of an extractor implementing:

public interface PropertyHolderExtractor {
    Property createProperty(String propertyName, Class<?> propertyType);

    Property createPropertyHolder(String propertyName, String mappingName);
}

This extractor can then be used to create the properties used in the metadata.
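For illustration, the engine could resolve the extractor once per mapping and then ask it for Property instances while building the metadata; a hypothetical sketch of that wiring (how the extractor instance is obtained is omitted):

PropertyHolderExtractor extractor = /* instantiated from the class passed to createPropertyHolderConstraintMapping() */;

// a plain constrained property declared as .property( "age", Integer.class )
Property ageProperty = extractor.createProperty( "age", Integer.class );

// a nested property holder declared as .propertyHolder( "address", "Address.class" )
Property addressProperty = extractor.createPropertyHolder( "address", "Address.class" );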
Another option might be to introduce some sort of a wrapper around the actual properties, something like:

public interface PropertyHolderProperty {

    Object getPropertyValue(String propertyName, Class<?> propertyType);
}

and then each time we need to call […].
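A Map-backed implementation of that wrapper idea could look roughly like this (MapPropertyHolderProperty is a made-up name, shown only as a sketch):

public class MapPropertyHolderProperty implements PropertyHolderProperty {

    private final Map<String, Object> propertyHolder;

    public MapPropertyHolderProperty(Map<String, Object> propertyHolder) {
        this.propertyHolder = propertyHolder;
    }

    @Override
    public Object getPropertyValue(String propertyName, Class<?> propertyType) {
        // the engine asks for a value by name; the expected type comes from the
        // constraint mapping, not from the runtime class of the holder
        return propertyType.cast( propertyHolder.get( propertyName ) );
    }
}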
And another question that I have: how crazy can the changes be around […]?

class MyBean {
@NotNull
private String name;
@ValidPropertyHolder("mapping")
private JsonObject myJsonObject;
}

Initially I was trying to fit it all within existing contexts, but it felt like hacking things to make them work. At the moment, I'm thinking about extracting a couple of interfaces from these contexts that we can later pass around. For example, in the case of […], methods like:

public T getRootBean() {
return rootBean;
}
public Class<T> getRootBeanClass() {
return rootBeanClass;
}
public BeanMetaData<T> getRootBeanMetaData() {
return rootBeanMetaData;
}
public Executable getExecutable() {
return executable;
}

are not used in […]; for the property holder case we would rather need something like:

public T getRootPropertyHolder() {
return rootPropertyHolder;
}
public PropertyHolderMetaData getPropertyHolderMetaData() {
return propertyHolderMetaData;
}

But before going further with this, I wanted to confirm that it would be OK :) That's why the question "how crazy can the changes be?". If so, I'll start working on decoupling the contexts before getting back to property holders.
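As a rough sketch of that decoupling (all names here are invented for illustration, not actual Hibernate Validator types), the common part could be pulled into a small base contract with bean-specific and property-holder-specific extensions:

interface BaseValidationContext<T> {
    T getRootValue();
    Set<ConstraintViolation<T>> getFailingConstraints();
}

interface BeanValidationContext<T> extends BaseValidationContext<T> {
    Class<T> getRootBeanClass();
    BeanMetaData<T> getRootBeanMetaData();
}

interface PropertyHolderValidationContext<T> extends BaseValidationContext<T> {
    PropertyHolderMetaData getPropertyHolderMetaData();
}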
So let's start with the easy question: "how crazy can the changes be around […]?". There are no crazy ideas. IMO, there are only a few guidelines we have to follow: avoid creating new objects in the hot paths and keep the code readable. Apart from that, feel free to make whatever changes you see fit, as 6.1 is a major version. I'm not very happy with […]. I concur with you that a specific […]. Now, the rest:
For me, the property value extractor was a way to extract the value from a PropertyHolder. So basically, you would have a path of property value extractors, just as you have a path of value extractors for container elements at the moment. What I envisioned (without experimenting with it or playing with the code, so consider this with care): […]
In any case, you would know which mapping to apply to build the […]. So all in all, I would leverage most of the existing infrastructure. Not sure it works, but it would be worth a try IMHO.
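A loose illustration of that "path of extractors" idea, using plain Maps as the property holders (everything below is made up for the sketch and refers to the mapping shown earlier):

Map<String, Object> address = new HashMap<>();
address.put( "street", "" );            // would violate the NotBlank constraint of the "Address.class" mapping

Map<String, Object> user = new HashMap<>();
user.put( "name", "Al" );               // would violate LengthDef().min( 3 ) of the "User.class" mapping
user.put( "address", address );         // nested property holder, cascaded via propertyHolder( "address", "Address.class" )

// conceptually, validation walks the declared mappings rather than runtime types:
// root ("User.class") -> "name" -> "address" (validated against "Address.class") -> "street"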
Hey @marko-bekhta, thanks a lot for this huge piece of work. Just a few comments from the sidelines here (I didn't take a very detailed look, just a few things that caught my attention). On a general level, it seems as if there's some redundancy added now to the programmatic API? Also, it's not quite clear to me yet whether this targets JSON specifically or a more general means of "free-form" validation (i.e. anything non-JavaBeans-based)?
@Incubating
public interface HibernateFreeFormValidator {

    Set<ConstraintViolation<JsonObject>> validateJson(JsonObject json, Class<?> typeToValidate, Class<?>... groups);
What is typeToValidate? Also, the scope of the interface is a bit blurry to me. The class comment describes it as a generic one, but the method signature is tied to JSON. Can we make it truly generic?
In this first experiment, Marko did use a class as a way to define the JSON type, hence this parameter here and the beanClass attribute you mention a bit later.
We already decided we don't want that, so it will be changed.
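Just to visualize what a "truly generic", string-based variant could look like (purely illustrative, not what this PR contains and not an agreed API):

@Incubating
public interface HibernateFreeFormValidator {

    // validate any supported property holder (JsonObject, Map, ...) against the
    // constraint mapping registered under the given name
    <T> Set<ConstraintViolation<T>> validatePropertyHolder(T propertyHolder, String mappingName, Class<?>... groups);
}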
 * @author Gunnar Morling
 * @author Kevin Pollet <[email protected]> (C) 2011 SERLI
 */
public interface Cascadable<C extends Cascadable<C>> {
Why is it that all these interfaces are copied for JSON?
So the issue is that when declaring something as cascading, we need to define the new JSON type that will be used.
Maybe we could try to use some more generics in the DSL and have this difference managed that way (e.g. having a generic parameter for the Cascadable DSL and injecting it where it makes sense).
Not sure it's doable or very future-proof.
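A rough sketch of the "generics in the DSL" idea; the extra type parameter and the valid( ... ) signature are invented for illustration:

// T is "how a cascading target is referenced": a Class<?> for beans,
// a String mapping name for property holders
public interface Cascadable<C extends Cascadable<C, T>, T> {

    // marks the element as cascading, pointing at the target type or mapping
    C valid(T cascadeTarget);
}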
 *
 * @author Marko Bekhta
 */
public class ConstrainedProperty extends AbstractConstrainedElement {
FWIW, ConstrainedElement always was meant to represent the "physical" elements of a type (i.e. fields, getters etc.), whereas the unified handling of fields and getters as "properties" only was part of the "aggregated model". I'm not sure whether it's a good idea to mix these two levels now.
Didn't know that, thanks for the clarification. I'll take this into account in the other PR for the "abstraction".
    private final ConfigurationSource source;

    private final Class<T> beanClass;
What's the beanClass in case of JSON?
In terms of referencing types in the programmatic config API, that's what I had in mind for that: […]
The current form ( […] ) […]
@gunnarmorling I see that @gsmet answered almost everything. This PR is a bit outdated now. I do have other patches in this direction of free-form validation. Right now we wanted to get the preparation work done first: […]
As for the "type", we ended up agreeing on using a String as a mapping name. But I think that your idea with […]
Closing in favor of #989.
This PR is not to be merged; it only exists to see whether we need to do anything else in #938, on top of which it is based. The last commit has the following changes:
- JsonMetadataProvider, which returns a couple of constraints for properties like name or email.
- JsonProperty and JsonConstrainableType, which extend the property API that we have. Note that I've hardcoded the list of possible properties into JsonConstrainableType, but in a real case I think we wouldn't need that at all and we would just programmatically define some constraints, (probably) without any check of whether the property is present or not...
I've copied this from the comment #938 (comment)