JObject.ToBsonDocument dropping values - json.net

I'm inserting raw JSON into a collection and finding that what is stored in the database is missing the values. For example, my collection is a collection of BsonDocuments:
_products = database.GetCollection<BsonDocument>("products");
The code to insert the JSON into the collection:
public void AddProductDetails(JObject json)
{
var doc = json.ToBsonDocument(DictionarySerializationOptions.Document);
_products.Insert(doc);
}
The JSON that is passed in looks like this:
{
"Id": 1,
"Tags": [
"book",
"database"
],
"Name": "Book Name",
"Price": 12.12
}
But, what is persisted in the collection is just the properties with no values.
{
"_id": {
"$oid": "5165c7e10fdb8c09f446d720"
},
"Id": [],
"Tags": [
[],
[]
],
"Name": [],
"Price": []
}
Why are the values being dropped?

This does what I was expecting.
public void AddProductDetails(JObject json)
{
BsonDocument doc = BsonDocument.Parse(json.ToString());
_products.Insert(doc);
}

I ran into this issue when I had a C# class with a property of type JObject.
My solution was to create a JObjectSerializer for MongoDB and add an attribute to the property so the Mongo serializer uses it. I assume that with a little more effort I could also register the serializer below as the global one for this type (see the sketch after the serializer).
Apply the serializer to the property:
[BsonSerializer(typeof(JObjectSerializer))]
public JObject AdditionalData { get; set; }
The serializer itself:
public class JObjectSerializer : SerializerBase<JObject> // IBsonSerializer<JObject>
{
public override JObject Deserialize(BsonDeserializationContext context, BsonDeserializationArgs args)
{
var myBSONDoc = BsonDocumentSerializer.Instance.Deserialize(context);
return JObject.Parse(myBSONDoc.ToString());
}
public override void Serialize(BsonSerializationContext context, BsonSerializationArgs args, JObject value)
{
var myBSONDoc = MongoDB.Bson.BsonDocument.Parse(value.ToString());
BsonDocumentSerializer.Instance.Serialize(context, myBSONDoc);
}
}
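For reference, registering the serializer globally instead of per property should just be the standard driver call (a sketch; I have only used the attribute approach myself):
BsonSerializer.RegisterSerializer(new JObjectSerializer());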

The problem with JObject.ToString, BsonDocument.Parse, etc. is that performance is not great: you do the same work multiple times, with extra string allocations, parsing, and so on.
So I have written a function that converts a JObject to an IEnumerable<KeyValuePair<string, object>> (using only enumerators), which is a type accepted by one of the BsonDocument constructors. Here is the code:
public static BsonDocument ToBsonDocument(this JObject jo)
{
if (jo == null)
return null;
return new BsonDocument(ToEnumerableWithObjects(jo));
}
public static IEnumerable<KeyValuePair<string, object>> ToEnumerableWithObjects(this JObject jo)
{
if (jo == null)
return Enumerable.Empty<KeyValuePair<string, object>>();
return new JObjectWrapper(jo);
}
private class JObjectWrapper : IEnumerable<KeyValuePair<string, object>>
{
private JObject _jo;
public JObjectWrapper(JObject jo)
{
_jo = jo;
}
public IEnumerator<KeyValuePair<string, object>> GetEnumerator() => new JObjectWrapperEnumerator(_jo);
IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
public static object ToValue(JToken token)
{
object value;
switch (token.Type)
{
case JTokenType.Object:
value = new JObjectWrapper((JObject)token);
break;
case JTokenType.Array:
value = new JArrayWrapper((JArray)token);
break;
default:
if (token is JValue jv)
{
value = jv.Value;
}
else
{
value = token.ToString();
}
break;
}
return value;
}
}
private class JArrayWrapper : IEnumerable
{
private JArray _ja;
public JArrayWrapper(JArray ja)
{
_ja = ja;
}
public IEnumerator GetEnumerator() => new JArrayWrapperEnumerator(_ja);
}
private class JArrayWrapperEnumerator : IEnumerator
{
private IEnumerator<JToken> _enum;
public JArrayWrapperEnumerator(JArray ja)
{
_enum = ja.GetEnumerator();
}
public object Current => JObjectWrapper.ToValue(_enum.Current);
public bool MoveNext() => _enum.MoveNext();
public void Reset() => _enum.Reset();
}
private class JObjectWrapperEnumerator : IEnumerator<KeyValuePair<string, object>>
{
private IEnumerator<KeyValuePair<string, JToken>> _enum;
public JObjectWrapperEnumerator(JObject jo)
{
_enum = jo.GetEnumerator();
}
public KeyValuePair<string, object> Current => new KeyValuePair<string, object>(_enum.Current.Key, JObjectWrapper.ToValue(_enum.Current.Value));
public bool MoveNext() => _enum.MoveNext();
public void Dispose() => _enum.Dispose();
public void Reset() => _enum.Reset();
object IEnumerator.Current => Current;
}
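Usage is then a drop-in replacement for the string-based approach (a sketch; _products is the collection from the question):
public void AddProductDetails(JObject json)
{
    // Enumerates the JObject directly into a BsonDocument, with no intermediate JSON string.
    var doc = json.ToBsonDocument();
    _products.Insert(doc);
}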

Have you tried using the BsonSerializer?
using MongoDB.Bson.Serialization;
[...]
var document = BsonSerializer.Deserialize<BsonDocument>(json);
BsonSerializer works with strings, so if the JSON argument is a JObject (or JArray, JRaw, etc.) you have to serialize it first with JsonConvert.SerializeObject().
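For example (a sketch, using the JObject from the question):
var document = BsonSerializer.Deserialize<BsonDocument>(JsonConvert.SerializeObject(json));
_products.Insert(document);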

Here is an updated version of Andrew DeVries's answer that includes handling for serializing/deserializing null values.
public class JObjectSerializer : SerializerBase<JObject>
{
public override JObject Deserialize(BsonDeserializationContext context, BsonDeserializationArgs args)
{
if (context.Reader.CurrentBsonType != BsonType.Null)
{
var myBSONDoc = BsonDocumentSerializer.Instance.Deserialize(context);
return JObject.Parse(myBSONDoc.ToStrictJson());
}
else
{
context.Reader.ReadNull();
return null;
}
}
public override void Serialize(BsonSerializationContext context, BsonSerializationArgs args, JObject value)
{
if (value != null)
{
var myBSONDoc = BsonDocument.Parse(value.ToString());
BsonDocumentSerializer.Instance.Serialize(context, myBSONDoc);
}
else
{
context.Writer.WriteNull();
}
}
}
The ToStrictJson() call is an extension method that wraps the call to the built-in BSON ToJson() method to include setting the output mode to strict. If this is not done, the parsing will fail because BSON type constructors will remain in the JSON output (ObjectId(), for example).
Here is the implementation of ToStrictJson() as well:
public static class MongoExtensionMethods
{
/// <summary>
/// Create a JsonWriterSettings object to use when serializing BSON docs to JSON.
/// This will force the serializer to create valid ("strict") JSON.
/// Without this, Object IDs and Dates are output as {"_id": ObjectId(ds8f7s9d87f89sd9f8d9f7sd9f9s8d)}
/// and {"date": ISODate("2020-04-14 14:30:00:000")} respectively, which is not valid JSON
/// </summary>
private static JsonWriterSettings jsonWriterSettings = new JsonWriterSettings()
{
OutputMode = JsonOutputMode.Strict
};
/// <summary>
/// Custom extension method to convert MongoDB objects to JSON using the OutputMode = Strict setting.
/// This ensures that the resulting string is valid JSON.
/// </summary>
/// <typeparam name="TNominalType">The type of object to convert to JSON</typeparam>
/// <param name="obj">The object to convert to JSON</param>
/// <returns>A strict JSON string representation of obj.</returns>
public static string ToStrictJson<TNominalType>(this TNominalType obj)
{
return BsonExtensionMethods.ToJson<TNominalType>(obj, jsonWriterSettings);
}
}

I use the following. It's based on Simon's answer (thanks for the idea) and works in the same way, avoiding unnecessary serialization/deserialization to and from a string.
It's just a bit more compact, thanks to LINQ and C# 10:
public static BsonDocument ToBsonDocument(this JObject o) =>
new(o.Properties().Select(p => new BsonElement(p.Name, p.Value.ToBsonValue())));
public static BsonValue ToBsonValue(this JToken t) =>
t switch
{
JObject o => o.ToBsonDocument(),
JArray a => new BsonArray(a.Select(ToBsonValue)),
JValue v => BsonValue.Create(v.Value),
_ => throw new NotSupportedException($"ToBsonValue: {t}")
};

Most of the answers here involve serializing to and then deserializing from a string. Here is a solution that serializes to/from raw BSON instead. It requires the Newtonsoft.Json.Bson NuGet package.
using System.IO;
using MongoDB.Bson.Serialization;
using MongoDB.Bson.Serialization.Serializers;
using Newtonsoft.Json;
using Newtonsoft.Json.Bson;
using Newtonsoft.Json.Linq;
namespace Zonal.EventPublisher.Worker
{
public class JObjectSerializer : SerializerBase<JObject>
{
public override JObject Deserialize(BsonDeserializationContext context, BsonDeserializationArgs args)
{
using (var stream = new MongoDB.Bson.IO.ByteBufferStream(context.Reader.ReadRawBsonDocument()))
using (JsonReader reader = new BsonDataReader(stream))
{
return JObject.Load(reader);
}
}
public override void Serialize(BsonSerializationContext context, BsonSerializationArgs args, JObject value)
{
using (var stream = new MemoryStream())
using (JsonWriter writer = new BsonDataWriter(stream))
{
value.WriteTo(writer);
var buffer = new MongoDB.Bson.IO.ByteArrayBuffer(stream.ToArray());
context.Writer.WriteRawBsonDocument(buffer);
}
}
}
}
Don't forget to register the serializer with:
BsonSerializer.RegisterSerializer(new JObjectSerializer());
After that you can convert your JObject to a BsonDocument by using the MongoDB.Bson.BsonExtensionMethods.ToBsonDocument extension method:
var myBsonDocument = myJObject.ToBsonDocument();
And convert a BsonDocument back to a JObject by using the MongoDB.Bson.Serialization.BsonSerializer class:
var myJObject = BsonSerializer.Deserialize<JObject>(myBsonDocument);

Related

Query Cosmos DB to get a list of different derived types using the .Net SDK Microsoft.Azure.Cosmos

We have an interface and a base class with multiple derived types.
public interface IEvent
{
[JsonProperty("id")]
public string Id { get; set; }
string Type { get; }
}
public abstract class EventBase: IEvent
{
public string Id { get; set; }
public abstract string Type { get; }
}
public class UserCreated : EventBase
{
public override string Type { get; } = typeof(UserCreated).AssemblyQualifiedName;
}
public class UserUpdated : EventBase
{
public override string Type { get; } = typeof(UserUpdated).AssemblyQualifiedName;
}
We are storing these events of different derived types in the same container in Cosmos DB using v3 of the .NET SDK Microsoft.Azure.Cosmos. We then want to read all the events and have them deserialized to the correct types.
public class CosmosDbTests
{
[Fact]
public async Task TestFetchingDerivedTypes()
{
var endpoint = "";
var authKey = "";
var databaseId ="";
var containerId="";
var client = new CosmosClient(endpoint, authKey);
var container = client.GetContainer(databaseId, containerId);
await container.CreateItemAsync(new UserCreated{ Id = Guid.NewGuid().ToString() });
await container.CreateItemAsync(new UserUpdated{ Id = Guid.NewGuid().ToString() });
var queryable = container.GetItemLinqQueryable<IEvent>();
var query = queryable.ToFeedIterator();
var list = new List<IEvent>();
while (query.HasMoreResults)
{
list.AddRange(await query.ReadNextAsync());
}
Assert.NotEmpty(list);
}
}
There doesn't seem to be any option to tell GetItemLinqQueryable how to handle derived types. Is there any other method or approach that supports multiple derived types in one query?
It's OK to put the events in some kind of wrapper entity if that would help, but they aren't allowed to be stored as a serialized string inside a property.
The comment from Stephen Cleary pointed me in the right direction, and with the help of this blog post https://thomaslevesque.com/2019/10/15/handling-type-hierarchies-in-cosmos-db-part-2/ I ended up with a solution similar to the following example, where we have a custom CosmosSerializer that uses a custom JsonConverter that reads the Type property.
public interface IEvent
{
[JsonProperty("id")]
public string Id { get; set; }
[JsonProperty("$type")]
string Type { get; }
}
public abstract class EventBase: IEvent
{
public string Id { get; set; }
public string Type => GetType().AssemblyQualifiedName;
}
public class UserCreated : EventBase
{
}
public class UserUpdated : EventBase
{
}
EventJsonConverter reads the Type property.
public class EventJsonConverter : JsonConverter
{
// This converter handles only deserialization, not serialization.
public override bool CanRead => true;
public override bool CanWrite => false;
public override bool CanConvert(Type objectType)
{
// Only convert when the target type is the IEvent interface
return objectType == typeof(IEvent);
}
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
// First, just read the JSON as a JObject
var obj = JObject.Load(reader);
// Then look at the $type property:
var typeName = obj["$type"]?.Value<string>();
return typeName == null ? null : obj.ToObject(Type.GetType(typeName), serializer);
}
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
throw new NotSupportedException("This converter handles only deserialization, not serialization.");
}
}
The NewtonsoftJsonCosmosSerializer takes a JsonSerializerSettings that it uses for serialization.
public class NewtonsoftJsonCosmosSerializer : CosmosSerializer
{
private static readonly Encoding DefaultEncoding = new UTF8Encoding(false, true);
private readonly JsonSerializer _serializer;
public NewtonsoftJsonCosmosSerializer(JsonSerializerSettings settings)
{
_serializer = JsonSerializer.Create(settings);
}
public override T FromStream<T>(Stream stream)
{
if (typeof(Stream).IsAssignableFrom(typeof(T)))
{
return (T)(object)stream;
}
using var sr = new StreamReader(stream);
using var jsonTextReader = new JsonTextReader(sr);
return _serializer.Deserialize<T>(jsonTextReader);
}
public override Stream ToStream<T>(T input)
{
var streamPayload = new MemoryStream();
using var streamWriter = new StreamWriter(streamPayload, encoding: DefaultEncoding, bufferSize: 1024, leaveOpen: true);
using JsonWriter writer = new JsonTextWriter(streamWriter);
writer.Formatting = _serializer.Formatting;
_serializer.Serialize(writer, input);
writer.Flush();
streamWriter.Flush();
streamPayload.Position = 0;
return streamPayload;
}
}
The CosmosClient is now created with our own NewtonsoftJsonCosmosSerializer and EventJsonConverter.
public class CosmosDbTests
{
[Fact]
public async Task TestFetchingDerivedTypes()
{
var endpoint = "";
var authKey = "";
var databaseId ="";
var containerId="";
var client = new CosmosClient(endpoint, authKey, new CosmosClientOptions
{
Serializer = new NewtonsoftJsonCosmosSerializer(new JsonSerializerSettings
{
Converters = { new EventJsonConverter() }
})
});
var container = client.GetContainer(databaseId, containerId);
await container.CreateItemAsync(new UserCreated{ Id = Guid.NewGuid().ToString() });
await container.CreateItemAsync(new UserUpdated{ Id = Guid.NewGuid().ToString() });
var queryable = container.GetItemLinqQueryable<IEvent>();
var query = queryable.ToFeedIterator();
var list = new List<IEvent>();
while (query.HasMoreResults)
{
list.AddRange(await query.ReadNextAsync());
}
Assert.NotEmpty(list);
}
}

A durable entity does not deserialize

I am trying to use a durable entity in my Azure Function to cache some data. However, when I try to retrieve the entity (state) for the first time, I get an exception indicating an issue during the entity deserialization.
Here is my entity class and related code
[JsonObject(MemberSerialization.OptIn)]
public class ActionTargetIdCache : IActionTargetIdCache
{
[JsonProperty("cache")]
public Dictionary<string, ActionTargetIdsCacheItemInfo> Cache { get; set; } = new Dictionary<string, ActionTargetIdsCacheItemInfo>();
public void CacheCleanup(DateTime currentUtcTime)
{
foreach (string officeHolderId in Cache.Keys)
{
TimeSpan cacheItemAge = currentUtcTime - Cache[officeHolderId].lastUpdatedTimeStamp;
if (cacheItemAge > TimeSpan.FromMinutes(2))
{
Cache.Remove(officeHolderId);
}
}
}
public void DeleteActionTargetIds(string officeHolderId)
{
if (this.Cache.ContainsKey(officeHolderId))
{
this.Cache.Remove(officeHolderId);
}
}
public void DeleteState()
{
Entity.Current.DeleteState();
}
public void SetActionTargetIds(ActionTargetIdsCacheEntry entry)
{
this.Cache[entry.Key] = entry.Value;
}
public Task<ActionTargetIdsCacheItemInfo> GetActionTargetIdsAsync(string officeHolderId)
{
if (this.Cache.ContainsKey(officeHolderId))
{
return Task.FromResult(Cache[officeHolderId]);
}
else
{
return Task.FromResult(new ActionTargetIdsCacheItemInfo());
}
}
// public void Reset() => this.CurrentValue = 0;
// public int Get() => this.CurrentValue;
[FunctionName(nameof(ActionTargetIdCache))]
public static Task Run([EntityTrigger] IDurableEntityContext ctx)
=> ctx.DispatchAsync<ActionTargetIdCache>();
}
public class ActionTargetIdsCacheEntry
{
// officeHolderId
public string Key { get; set; } = string.Empty;
public ActionTargetIdsCacheItemInfo Value { get; set; } = new ActionTargetIdsCacheItemInfo();
}
[JsonObject(MemberSerialization.OptIn)]
public class ActionTargetIdsCacheItemInfo : ISerializable
{
public ActionTargetIdsCacheItemInfo()
{
lastUpdatedTimeStamp = DateTime.UtcNow;
actionTargetIds = new List<string>();
}
public ActionTargetIdsCacheItemInfo(SerializationInfo info, StreamingContext context)
{
lastUpdatedTimeStamp = info.GetDateTime("lastUpdated");
actionTargetIds = (List<string>)info.GetValue("actionTargetIds", typeof(List<string>));
}
[JsonProperty]
public DateTimeOffset lastUpdatedTimeStamp { get; set; } = DateTimeOffset.UtcNow;
[JsonProperty]
public List<string> actionTargetIds { get; set; } = new List<string>();
public void GetObjectData(SerializationInfo info, StreamingContext context)
{
info.AddValue("lastUpdated", lastUpdatedTimeStamp);
info.AddValue("actionTargetIds", actionTargetIds);
}
}
public interface IActionTargetIdCache
{
void CacheCleanup(DateTime currentUtcTime);
void DeleteActionTargetIds(string officeHolderId);
void DeleteState();
void SetActionTargetIds(ActionTargetIdsCacheEntry item);
// Task Reset();
Task<ActionTargetIdsCacheItemInfo> GetActionTargetIdsAsync(string officeHolderId);
// void Delete();
}
Here is the exception I get during the first attempt to access the state from an orchestration using the GetActionTargetIdsAsync method:
Exception has occurred: CLR/Microsoft.Azure.WebJobs.Extensions.DurableTask.EntitySchedulerException
Exception thrown: 'Microsoft.Azure.WebJobs.Extensions.DurableTask.EntitySchedulerException' in System.Private.CoreLib.dll: 'Failed to populate entity state from JSON: Cannot deserialize the current JSON array (e.g. [1,2,3]) into type 'PolTrack.CdbGetFunctionApp.ActionTargetIdsCacheItemInfo' because the type requires a JSON object (e.g. {"name":"value"}) to deserialize correctly.
To fix this error either change the JSON to a JSON object (e.g. {"name":"value"}) or change the deserialized type to an array or a type that implements a collection interface (e.g. ICollection, IList) like List<T> that can be deserialized from a JSON array. JsonArrayAttribute can also be added to the type to force it to deserialize from a JSON array.
Path 'cache.officeHolderId1', line 1, position 29.'
Inner exceptions found, see $exception in variables window for more details.
Innermost exception Newtonsoft.Json.JsonSerializationException : Cannot deserialize the current JSON array (e.g. [1,2,3]) into type 'PolTrack.CdbGetFunctionApp.ActionTargetIdsCacheItemInfo' because the type requires a JSON object (e.g. {"name":"value"}) to deserialize correctly.
To fix this error either change the JSON to a JSON object (e.g. {"name":"value"}) or change the deserialized type to an array or a type that implements a collection interface (e.g. ICollection, IList) like List<T> that can be deserialized from a JSON array. JsonArrayAttribute can also be added to the type to force it to deserialize from a JSON array.
Path 'cache.officeHolderId1', line 1, position 29.
I did manage to get around this by following #silent's suggestion. I re-designed the entity class to use only CLR types. In my case, that meant replacing Dictionary<string, ActionTargetIdsCacheItemInfo> with two dictionaries, Dictionary<string, List<string>> and Dictionary<string, DateTimeOffset>.
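For illustration, here is a minimal sketch of what the re-designed, CLR-types-only entity state might look like (the property names are my own, not from the original code):
[JsonObject(MemberSerialization.OptIn)]
public class ActionTargetIdCache
{
    // Only plain CLR types, so the Durable Entities JSON serializer can round-trip the state.
    [JsonProperty("actionTargetIds")]
    public Dictionary<string, List<string>> ActionTargetIds { get; set; } = new Dictionary<string, List<string>>();
    [JsonProperty("lastUpdated")]
    public Dictionary<string, DateTimeOffset> LastUpdated { get; set; } = new Dictionary<string, DateTimeOffset>();
}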

Serializing a custom subclass of NameValueCollection with Json.Net

I have the following class I am unsuccessfully attempting to serialize to Json.
class HL7 : NameValueCollection
{
public List<HL7> Children { get; set; }
public HL7()
{
Children = new List<HL7>();
}
}
I have created the object like so and added data to it:
HL7 hl7 = new HL7();
hl7.Add("a", "123");
hl7.Add("b", "456");
hl7.Children.Add(new HL7());
hl7.Children[0].Add("c", "123");
hl7.Children[0].Add("d", "456");
When I call
JsonConvert.SerializeObject(hl7)
I receive
["a","b"]
I was expecting the following:
{
"a": "123",
"b": "456",
"Children": [
{
"c": "123",
"d": "456",
}
]
}
There are a few things going on here:
Json.NET cannot serialize a NameValueCollection without a custom converter because NameValueCollection implements IEnumerable for iterating over the keys, but does not implement IDictionary for iterating over keys and values. See this answer for a fuller explanation of why this causes problems for Json.NET.
Because NameValueCollection implements IEnumerable, Json.NET sees your class as a collection, and so serializes it as a JSON array and not a JSON object with named properties. Thus, your Children are not serialized. Again, a custom converter would be required to fix this.
Assuming the above issues are resolved, if your HL7 subclass of NameValueCollection happens to have a key named "Children" you will generate invalid JSON when serializing it, namely an object with duplicated property names. I suggest moving the names & values into a nested property (named, e.g., "Values") for purposes of unambiguous serialization.
NameValueCollection actually can have multiple string values for a given key string, so its entry values need to be serialized as a JSON array not a single string.
Putting all this together, the following code:
[JsonConverter(typeof(HL7Converter))]
public class HL7 : NameValueCollection
{
public List<HL7> Children { get; set; }
public HL7()
{
Children = new List<HL7>();
}
}
public class HL7Converter : JsonConverter
{
class HL7Proxy
{
public NameValueCollectionDictionaryWrapper Values { get; set; }
public List<HL7> Children { get; set; }
}
public override bool CanConvert(Type objectType)
{
return objectType == typeof(HL7);
}
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
var proxy = serializer.Deserialize<HL7Proxy>(reader);
if (proxy == null)
return existingValue;
var hl7 = existingValue as HL7;
if (hl7 == null)
hl7 = new HL7();
hl7.Add(proxy.Values.GetCollection());
if (proxy.Children != null)
hl7.Children.AddRange(proxy.Children);
return hl7;
}
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
HL7 hl7 = (HL7)value;
if (hl7 == null)
return;
serializer.Serialize(writer, new HL7Proxy { Children = hl7.Children, Values = new NameValueCollectionDictionaryWrapper(hl7) });
}
}
// Proxy dictionary to serialize & deserialize a NameValueCollection. We use a proxy dictionary rather than a real dictionary because NameValueCollection is an ordered collection but the generic dictionary class is unordered.
public class NameValueCollectionDictionaryWrapper: IDictionary<string, string []>
{
readonly NameValueCollection collection;
public NameValueCollectionDictionaryWrapper()
: this(new NameValueCollection())
{
}
public NameValueCollectionDictionaryWrapper(NameValueCollection collection)
{
this.collection = collection;
}
// Method instead of a property to guarantee that nobody tries to serialize it.
public NameValueCollection GetCollection()
{
return collection;
}
#region IDictionary<string,string[]> Members
public void Add(string key, string[] value)
{
if (collection.GetValues(key) != null)
throw new ArgumentException("Duplicate key " + key);
foreach (var str in value)
collection.Add(key, str);
}
public bool ContainsKey(string key)
{
return collection.GetValues(key) != null;
}
public ICollection<string> Keys
{
get {
return collection.AllKeys;
}
}
public bool Remove(string key)
{
bool found = ContainsKey(key);
if (found)
collection.Remove(key);
return found;
}
public bool TryGetValue(string key, out string[] value)
{
value = collection.GetValues(key);
return value != null;
}
public ICollection<string[]> Values
{
get {
return Enumerable.Range(0, collection.Count).Select(i => collection.GetValues(i)).ToArray();
}
}
public string[] this[string key]
{
get
{
var value = collection.GetValues(key);
if (value == null)
throw new KeyNotFoundException();
return value;
}
set
{
Remove(key);
Add(key, value);
}
}
#endregion
#region ICollection<KeyValuePair<string,string[]>> Members
public void Add(KeyValuePair<string, string[]> item)
{
Add(item.Key, item.Value);
}
public void Clear()
{
collection.Clear();
}
public bool Contains(KeyValuePair<string, string[]> item)
{
string [] value;
if (!TryGetValue(item.Key, out value))
return false;
return EqualityComparer<string[]>.Default.Equals(item.Value, value); // Consistent with Dictionary<TKey, TValue>
}
public void CopyTo(KeyValuePair<string, string[]>[] array, int arrayIndex)
{
foreach (var item in this)
array[arrayIndex++] = item;
}
public int Count
{
get { return collection.Count; }
}
public bool IsReadOnly
{
get { return false; }
}
public bool Remove(KeyValuePair<string, string[]> item)
{
if (Contains(item))
return Remove(item.Key);
return false;
}
#endregion
#region IEnumerable<KeyValuePair<string,string[]>> Members
public IEnumerator<KeyValuePair<string, string[]>> GetEnumerator()
{
foreach (string key in collection)
{
yield return new KeyValuePair<string, string[]>(key, collection.GetValues(key));
}
}
#endregion
#region IEnumerable Members
System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
{
return GetEnumerator();
}
#endregion
}
Using the following test case:
HL7 hl7 = new HL7();
hl7.Add("a", "123");
hl7.Add("b", "456");
hl7.Add("Children", "Children");
hl7.Children.Add(new HL7());
hl7.Children[0].Add("c", "123");
hl7.Children[0].Add("d", "456");
hl7.Children[0].Add("d", "789");
var json = JsonConvert.SerializeObject(hl7, Formatting.Indented);
Debug.WriteLine(json);
Gives the following JSON:
{
"Values": {
"a": [
"123"
],
"b": [
"456"
],
"Children": [
"Children"
]
},
"Children": [
{
"Values": {
"c": [
"123"
],
"d": [
"456",
"789"
]
},
"Children": []
}
]
}
Inspired by this answer to "how to convert NameValueCollection to JSON string?", here is the working code. (The only fragile part is probably the hard-coded "Children" property name: if you rename the property in a refactor, this will break.)
JsonConvert.SerializeObject(NvcToDictionary(hl7, false));
And the function:
static Dictionary<string, object> NvcToDictionary(HL7 nvc, bool handleMultipleValuesPerKey)
{
var result = new Dictionary<string, object>();
foreach (string key in nvc.Keys)
{
if (handleMultipleValuesPerKey)
{
string[] values = nvc.GetValues(key);
if (values.Length == 1)
{
result.Add(key, values[0]);
}
else
{
result.Add(key, values);
}
}
else
{
result.Add(key, nvc[key]);
}
}
if (nvc.Children.Any())
{
var listOfChildrenDictionary = new List<Dictionary<string, object>>();
foreach (var nvcChildren in nvc.Children){
listOfChildrenDictionary.Add(NvcToDictionary(nvcChildren, false));
}
result.Add("Children", listOfChildrenDictionary);
}
return result;
}
I have had issues serializing NameValueCollections with Json.NET. The only way I have found is to convert the collection to a dictionary and then serialize that, like:
var jsonString = JsonConvert.SerializeObject(new
{
Parent = hl7.AllKeys.ToDictionary(r => r, r => hl7[r]),
Children = hl7.Children.Select(c => c.AllKeys.ToDictionary(sub => sub, sub => c[sub]))
}, Newtonsoft.Json.Formatting.Indented);
and you will end up with:
{
"Parent": {
"a": "123",
"b": "456"
},
"Children": [
{
"c": "123",
"d": "456"
}
]
}
But this will wrap the top-level items in a "Parent" property as well, since you have to give the property a name in the anonymous type.
Here's a custom converter that will write the JSON you were looking for; an example program is attached, with the converter at the bottom. Note that you will need to add this converter to the JSON serializer settings, either through the default settings as I've done, or through the constructor of your serializer. Alternatively, since you have a subclass, you can use JsonConverterAttribute on the HL7 class to assign the converter.
public class Program
{
static int Main(string[] args) {
JsonConvert.DefaultSettings = () => new JsonSerializerSettings {
Converters = new []{ new HL7Converter() }
};
HL7 hl7 = new HL7();
hl7.Add("a", "123");
hl7.Add("b", "456");
hl7.Children.Add(new HL7());
hl7.Children[0].Add("c", "123");
hl7.Children[0].Add("d", "456");
Console.WriteLine (JsonConvert.SerializeObject (hl7));
return 0;
}
}
public class HL7 : NameValueCollection
{
public List<HL7> Children { get; set; }
public HL7()
{
Children = new List<HL7> ();
}
}
public class HL7Converter : Newtonsoft.Json.JsonConverter {
#region implemented abstract members of JsonConverter
public override void WriteJson (Newtonsoft.Json.JsonWriter writer, object value, Newtonsoft.Json.JsonSerializer serializer)
{
var collection = (HL7)value;
writer.WriteStartObject ();
foreach (var key in collection.AllKeys) {
writer.WritePropertyName (key);
writer.WriteValue (collection [key]);
}
writer.WritePropertyName ("Children");
serializer.Serialize (writer,collection.Children);
writer.WriteEndObject ();
}
public override object ReadJson (Newtonsoft.Json.JsonReader reader, Type objectType, object existingValue, Newtonsoft.Json.JsonSerializer serializer)
{
HL7 collection = existingValue as HL7 ?? new HL7 ();
JObject jObj = JObject.Load (reader);
foreach (var prop in jObj.Properties()) {
if (prop.Name != "Children") {
collection.Add (prop.Name, prop.Value.ToObject<string> ());
} else {
collection.Children = prop.Value.ToObject<List<HL7>> (serializer);
}
}
return collection;
}
public override bool CanConvert (Type objectType)
{
return objectType == typeof(HL7);
}
#endregion
}

JSON.NET Deserialize objects within object / array of objects

I have a situation where an API I'm using is returning inconsistent JSON, which I want to deserialize using JSON.NET. In one case, it returns an object that contains objects (note that the outer "1" can be any number):
{
"1":{
"0":{
"db_id":"12835424",
"title":"XXX"
},
"1":{
"db_id":"12768978",
"title":"YYY"
},
"2":{
"db_id":"12768980",
"title":"ZZZ"
},
"3":{
"db_id":"12768981",
"title":"PPP"
}
}
}
And in another case, it returns an array of objects:
{
"3":[
{
"db_id":"12769199",
"title":"XXX"
},
{
"db_id":"12769200",
"title":"YYY"
},
{
"db_id":"12769202",
"title":"ZZZ"
},
{
"db_id":"12769243",
"title":"PPP"
}
]
}
I have no idea why this inconsistency exists, but this is the format I'm working with. What would be the correct way to deserialize both formats with the JsonConvert.DeserializeObject method?
With the current version of Json.NET (Json.NET 4.5 Release 11), here is a CustomCreationConverter that will handle JSON that deserializes sometimes as an object and sometimes as an array.
public class ObjectToArrayConverter<T> : CustomCreationConverter<List<T>> where T : new()
{
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
List<T> target = new List<T>();
try
{
// Load JArray from stream
JArray jArray = JArray.Load(reader);
// Populate the object properties
serializer.Populate(jArray.CreateReader(), target);
}
catch (JsonReaderException)
{
// Handle case when object is not an array...
// Load JObject from stream
JObject jObject = JObject.Load(reader);
// Create target object based on JObject
T t = new T();
// Populate the object properties
serializer.Populate(jObject.CreateReader(), t);
target.Add(t);
}
return target;
}
public override List<T> Create(Type objectType)
{
return new List<T>();
}
}
Example Usage:
[JsonObject]
public class Project
{
[JsonProperty]
public string id { get; set; }
// The Json for this property sometimes comes in as an array of task objects,
// and sometimes it is just a single task object.
[JsonProperty]
[JsonConverter(typeof(ObjectToArrayConverter<Task>))]
public List<Task> tasks{ get; set; }
}
[JsonObject]
public class Task
{
[JsonProperty]
public string name { get; set; }
[JsonProperty]
public DateTime due { get; set; }
}
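Deserializing is then the usual call, and both payload shapes end up in the tasks list (a sketch; json is assumed to hold a project object whose tasks property is either a single task object or an array):
var project = JsonConvert.DeserializeObject<Project>(json);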
I think this is something that should be possible by creating a JsonCreationConverter. This article can probably help out: http://dotnetbyexample.blogspot.nl/2012/02/json-deserialization-with-jsonnet-class.html

Best way to trim strings after data entry. Should I create a custom model binder?

I'm using ASP.NET MVC and I'd like all user entered string fields to be trimmed before they're inserted into the database. And since I have many data entry forms, I'm looking for an elegant way to trim all strings instead of explicitly trimming every user supplied string value. I'm interested to know how and when people are trimming strings.
I thought about perhaps creating a custom model binder and trimming any string values there...that way, all my trimming logic is contained in one place. Is this a good approach? Are there any code samples that do this?
public class TrimModelBinder : DefaultModelBinder
{
protected override void SetProperty(ControllerContext controllerContext,
ModelBindingContext bindingContext,
System.ComponentModel.PropertyDescriptor propertyDescriptor, object value)
{
if (propertyDescriptor.PropertyType == typeof(string))
{
var stringValue = (string)value;
if (!string.IsNullOrWhiteSpace(stringValue))
{
value = stringValue.Trim();
}
else
{
value = null;
}
}
base.SetProperty(controllerContext, bindingContext,
propertyDescriptor, value);
}
}
How about this?
ModelBinders.Binders.DefaultBinder = new TrimModelBinder();
Set this in the Application_Start event in Global.asax, as sketched below.
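In context, that looks something like this (a sketch of a typical Global.asax.cs; the other startup calls are just placeholders):
protected void Application_Start()
{
    AreaRegistration.RegisterAllAreas();
    RouteConfig.RegisterRoutes(RouteTable.Routes);
    // Replace the default binder so every bound string gets trimmed.
    ModelBinders.Binders.DefaultBinder = new TrimModelBinder();
}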
This is the same solution as #takepara's, but as an IModelBinder instead of a DefaultModelBinder, so the model binder is added in Global.asax through
ModelBinders.Binders.Add(typeof(string),new TrimModelBinder());
The class:
public class TrimModelBinder : IModelBinder
{
public object BindModel(ControllerContext controllerContext,
ModelBindingContext bindingContext)
{
ValueProviderResult valueResult = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
if (valueResult== null || valueResult.AttemptedValue==null)
return null;
else if (valueResult.AttemptedValue == string.Empty)
return string.Empty;
return valueResult.AttemptedValue.Trim();
}
}
Based on #haacked's post:
http://haacked.com/archive/2011/03/19/fixing-binding-to-decimals.aspx
One improvement to #takepara's answer.
Somewhere in the project:
public class NoTrimAttribute : Attribute { }
In TrimModelBinder class change
if (propertyDescriptor.PropertyType == typeof(string))
to
if (propertyDescriptor.PropertyType == typeof(string) && !propertyDescriptor.Attributes.Cast<object>().Any(a => a.GetType() == typeof(NoTrimAttribute)))
and you can mark properties to be excluded from trimming with [NoTrim] attribute.
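For example (a hypothetical model; only the property marked [NoTrim] keeps its surrounding whitespace):
public class PersonEditModel
{
    public string Name { get; set; }      // trimmed by TrimModelBinder
    [NoTrim]
    public string Password { get; set; }  // left exactly as entered
}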
In ASP.NET Core 2 this worked for me. I'm using the [FromBody] attribute in my controllers and JSON input. To override the string handling during JSON deserialization, I registered my own JsonConverter:
services.AddMvcCore()
.AddJsonOptions(options =>
{
options.SerializerSettings.Converters.Insert(0, new TrimmingStringConverter());
})
And this is the converter:
public class TrimmingStringConverter : JsonConverter
{
public override bool CanRead => true;
public override bool CanWrite => false;
public override bool CanConvert(Type objectType) => objectType == typeof(string);
public override object ReadJson(JsonReader reader, Type objectType,
object existingValue, JsonSerializer serializer)
{
if (reader.Value is string value)
{
return value.Trim();
}
return reader.Value;
}
public override void WriteJson(JsonWriter writer, object value,
JsonSerializer serializer)
{
throw new NotImplementedException();
}
}
With improvements in C# 6, you can now write a very compact model binder that will trim all string inputs:
public class TrimStringModelBinder : IModelBinder
{
public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
{
var value = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
var attemptedValue = value?.AttemptedValue;
return string.IsNullOrWhiteSpace(attemptedValue) ? attemptedValue : attemptedValue.Trim();
}
}
You need to include this line somewhere in Application_Start() in your Global.asax.cs file to use the model binder when binding strings:
ModelBinders.Binders.Add(typeof(string), new TrimStringModelBinder());
I find it is better to use a model binder like this, rather than overriding the default model binder, because it will then be used whenever a string is bound, whether that's directly as a method argument or as a property on a model class. However, if you override the default model binder as other answers here suggest, it will only apply when binding properties on models, not when you have a string as an argument to an action method.
Edit: a commenter asked about dealing with the situation when a field should not be validated. My original answer was reduced to deal just with the question the OP had posed, but for those who are interested, you can deal with validation by using the following extended model binder:
public class TrimStringModelBinder : IModelBinder
{
public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
{
var shouldPerformRequestValidation = controllerContext.Controller.ValidateRequest && bindingContext.ModelMetadata.RequestValidationEnabled;
var unvalidatedValueProvider = bindingContext.ValueProvider as IUnvalidatedValueProvider;
var value = unvalidatedValueProvider == null ?
bindingContext.ValueProvider.GetValue(bindingContext.ModelName) :
unvalidatedValueProvider.GetValue(bindingContext.ModelName, !shouldPerformRequestValidation);
var attemptedValue = value?.AttemptedValue;
return string.IsNullOrWhiteSpace(attemptedValue) ? attemptedValue : attemptedValue.Trim();
}
}
Another variant of #takepara's answer but with a different twist:
1) I prefer the opt-in "StringTrim" attribute mechanism (rather than the opt-out "NoTrim" example of #Anton).
2) An additional call to SetModelValue is required to ensure the ModelState is populated correctly and the default validation/accept/reject pattern can be used as normal, i.e. TryUpdateModel(model) to apply and ModelState.Clear() to accept all changes.
Put this in your entity/shared library:
/// <summary>
/// Denotes a data field that should be trimmed during binding, removing any spaces.
/// </summary>
/// <remarks>
/// <para>
/// Support for trimming is implemented in the model binder, as currently
/// Data Annotations provides no mechanism to coerce the value.
/// </para>
/// <para>
/// This attribute does not imply that empty strings should be converted to null.
/// When that is required you must additionally use the <see cref="System.ComponentModel.DataAnnotations.DisplayFormatAttribute.ConvertEmptyStringToNull"/>
/// option to control what happens to empty strings.
/// </para>
/// </remarks>
[AttributeUsage(AttributeTargets.Property | AttributeTargets.Field, AllowMultiple = false)]
public class StringTrimAttribute : Attribute
{
}
Then this in your MVC application/library:
/// <summary>
/// MVC model binder which trims string values decorated with the <see cref="StringTrimAttribute"/>.
/// </summary>
public class StringTrimModelBinder : IModelBinder
{
/// <summary>
/// Binds the model, applying trimming when required.
/// </summary>
public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
{
// Get binding value (return null when not present)
var propertyName = bindingContext.ModelName;
var originalValueResult = bindingContext.ValueProvider.GetValue(propertyName);
if (originalValueResult == null)
return null;
var boundValue = originalValueResult.AttemptedValue;
// Trim when required
if (!String.IsNullOrEmpty(boundValue))
{
// Check for trim attribute
if (bindingContext.ModelMetadata.ContainerType != null)
{
var property = bindingContext.ModelMetadata.ContainerType.GetProperties()
.FirstOrDefault(propertyInfo => propertyInfo.Name == bindingContext.ModelMetadata.PropertyName);
if (property != null && property.GetCustomAttributes(true)
.OfType<StringTrimAttribute>().Any())
{
// Trim when attribute set
boundValue = boundValue.Trim();
}
}
}
// Register updated "attempted" value with the model state
bindingContext.ModelState.SetModelValue(propertyName, new ValueProviderResult(
originalValueResult.RawValue, boundValue, originalValueResult.Culture));
// Return bound value
return boundValue;
}
}
If you don't set the property value in the binder, even when you don't want to change anything, you will block that property from ModelState altogether! This is because you are registered as binding all string types, so it appears (in my testing) that the default binder will not do it for you then.
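For completeness, here is a sketch of how this could be wired up and used; the registration line and the model are my own assumptions, not part of the original answer:
// In Global.asax Application_Start
ModelBinders.Binders.Add(typeof(string), new StringTrimModelBinder());
// Opt in per property
public class CustomerModel
{
    [StringTrim]
    public string Name { get; set; }
}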
Extra info for anyone searching for how to do this in ASP.NET Core 1.0. The logic has changed quite a lot.
I wrote a blog post about how to do it, which explains things in a bit more detail.
So ASP.NET Core 1.0 solution:
Model binder to do the actual trimming
public class TrimmingModelBinder : ComplexTypeModelBinder
{
public TrimmingModelBinder(IDictionary<ModelMetadata, IModelBinder> propertyBinders) : base(propertyBinders)
{
}
protected override void SetProperty(ModelBindingContext bindingContext, string modelName, ModelMetadata propertyMetadata, ModelBindingResult result)
{
if(result.Model is string)
{
string resultStr = (result.Model as string).Trim();
result = ModelBindingResult.Success(resultStr);
}
base.SetProperty(bindingContext, modelName, propertyMetadata, result);
}
}
In the latest version you also need a model binder provider; this tells MVC whether this binder should be used for a given model.
public class TrimmingModelBinderProvider : IModelBinderProvider
{
public IModelBinder GetBinder(ModelBinderProviderContext context)
{
if (context == null)
{
throw new ArgumentNullException(nameof(context));
}
if (context.Metadata.IsComplexType && !context.Metadata.IsCollectionType)
{
var propertyBinders = new Dictionary<ModelMetadata, IModelBinder>();
foreach (var property in context.Metadata.Properties)
{
propertyBinders.Add(property, context.CreateBinder(property));
}
return new TrimmingModelBinder(propertyBinders);
}
return null;
}
}
Then it has to be registered in Startup.cs
services.AddMvc().AddMvcOptions(options => {
options.ModelBinderProviders.Insert(0, new TrimmingModelBinderProvider());
});
In the case of ASP.NET Core MVC:
Binder:
using Microsoft.AspNetCore.Mvc.ModelBinding;
using System;
using System.Threading.Tasks;
public class TrimmingModelBinder
: IModelBinder
{
private readonly IModelBinder FallbackBinder;
public TrimmingModelBinder(IModelBinder fallbackBinder)
{
FallbackBinder = fallbackBinder ?? throw new ArgumentNullException(nameof(fallbackBinder));
}
public Task BindModelAsync(ModelBindingContext bindingContext)
{
if (bindingContext == null)
{
throw new ArgumentNullException(nameof(bindingContext));
}
var valueProviderResult = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
if (valueProviderResult != null &&
valueProviderResult.FirstValue is string str &&
!string.IsNullOrEmpty(str))
{
bindingContext.Result = ModelBindingResult.Success(str.Trim());
return Task.CompletedTask;
}
return FallbackBinder.BindModelAsync(bindingContext);
}
}
Provider:
using Microsoft.AspNetCore.Mvc.ModelBinding;
using Microsoft.AspNetCore.Mvc.ModelBinding.Binders;
using System;
public class TrimmingModelBinderProvider
: IModelBinderProvider
{
public IModelBinder GetBinder(ModelBinderProviderContext context)
{
if (context == null)
{
throw new ArgumentNullException(nameof(context));
}
if (!context.Metadata.IsComplexType && context.Metadata.ModelType == typeof(string))
{
return new TrimmingModelBinder(new SimpleTypeModelBinder(context.Metadata.ModelType));
}
return null;
}
}
Registration function:
public static void AddStringTrimmingProvider(this MvcOptions option)
{
var binderToFind = option.ModelBinderProviders
.FirstOrDefault(x => x.GetType() == typeof(SimpleTypeModelBinderProvider));
if (binderToFind == null)
{
return;
}
var index = option.ModelBinderProviders.IndexOf(binderToFind);
option.ModelBinderProviders.Insert(index, new TrimmingModelBinderProvider());
}
Register:
service.AddMvc(option => option.AddStringTrimmingProvider())
I created value providers to trim the query string parameter values and the form values. This was tested with ASP.NET Core 3 and works perfectly.
public class TrimmedFormValueProvider
: FormValueProvider
{
public TrimmedFormValueProvider(IFormCollection values)
: base(BindingSource.Form, values, CultureInfo.InvariantCulture)
{ }
public override ValueProviderResult GetValue(string key)
{
ValueProviderResult baseResult = base.GetValue(key);
string[] trimmedValues = baseResult.Values.Select(v => v?.Trim()).ToArray();
return new ValueProviderResult(new StringValues(trimmedValues));
}
}
public class TrimmedQueryStringValueProvider
: QueryStringValueProvider
{
public TrimmedQueryStringValueProvider(IQueryCollection values)
: base(BindingSource.Query, values, CultureInfo.InvariantCulture)
{ }
public override ValueProviderResult GetValue(string key)
{
ValueProviderResult baseResult = base.GetValue(key);
string[] trimmedValues = baseResult.Values.Select(v => v?.Trim()).ToArray();
return new ValueProviderResult(new StringValues(trimmedValues));
}
}
public class TrimmedFormValueProviderFactory
: IValueProviderFactory
{
public Task CreateValueProviderAsync(ValueProviderFactoryContext context)
{
if (context.ActionContext.HttpContext.Request.HasFormContentType)
context.ValueProviders.Add(new TrimmedFormValueProvider(context.ActionContext.HttpContext.Request.Form));
return Task.CompletedTask;
}
}
public class TrimmedQueryStringValueProviderFactory
: IValueProviderFactory
{
public Task CreateValueProviderAsync(ValueProviderFactoryContext context)
{
context.ValueProviders.Add(new TrimmedQueryStringValueProvider(context.ActionContext.HttpContext.Request.Query));
return Task.CompletedTask;
}
}
Then register the value provider factories in the ConfigureServices() function in Startup.cs
services.AddControllersWithViews(options =>
{
int formValueProviderFactoryIndex = options.ValueProviderFactories.IndexOf(options.ValueProviderFactories.OfType<FormValueProviderFactory>().Single());
options.ValueProviderFactories[formValueProviderFactoryIndex] = new TrimmedFormValueProviderFactory();
int queryStringValueProviderFactoryIndex = options.ValueProviderFactories.IndexOf(options.ValueProviderFactories.OfType<QueryStringValueProviderFactory>().Single());
options.ValueProviderFactories[queryStringValueProviderFactoryIndex] = new TrimmedQueryStringValueProviderFactory();
});
While reading through the excellent answers and comments above, and becoming increasingly confused, I suddenly thought, hey, I wonder if there's a jQuery solution. So for others who, like me, find ModelBinders a bit bewildering, I offer the following jQuery snippet that trims the input fields before the form gets submitted.
$('form').submit(function () {
$(this).find('input:text').each(function () {
$(this).val($.trim($(this).val()));
})
});
Late to the party, but the following is a summary of the adjustments required for MVC 5.2.3 if you need to handle the skipValidation requirement of the built-in value providers.
public class TrimStringModelBinder : IModelBinder
{
public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
{
// First check if request validation is required
var shouldPerformRequestValidation = controllerContext.Controller.ValidateRequest &&
bindingContext.ModelMetadata.RequestValidationEnabled;
// determine if the value provider is IUnvalidatedValueProvider, if it is, pass in the
// flag to perform request validation (e.g. [AllowHtml] is set on the property)
var unvalidatedProvider = bindingContext.ValueProvider as IUnvalidatedValueProvider;
var valueProviderResult = unvalidatedProvider?.GetValue(bindingContext.ModelName, !shouldPerformRequestValidation) ??
bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
return valueProviderResult?.AttemptedValue?.Trim();
}
}
Global.asax
protected void Application_Start()
{
...
ModelBinders.Binders.Add(typeof(string), new TrimStringModelBinder());
...
}
Update: This answer is out of date for recent versions of ASP.NET Core. Use Bassem's answer instead.
For ASP.NET Core, replace the ComplexTypeModelBinderProvider with a provider that trims strings.
In your startup code ConfigureServices method, add this:
services.AddMvc()
.AddMvcOptions(s => {
s.ModelBinderProviders[s.ModelBinderProviders.TakeWhile(p => !(p is ComplexTypeModelBinderProvider)).Count()] = new TrimmingModelBinderProvider();
})
Define TrimmingModelBinderProvider like this:
/// <summary>
/// Used in place of <see cref="ComplexTypeModelBinderProvider"/> to trim beginning and ending whitespace from user input.
/// </summary>
class TrimmingModelBinderProvider : IModelBinderProvider
{
class TrimmingModelBinder : ComplexTypeModelBinder
{
public TrimmingModelBinder(IDictionary<ModelMetadata, IModelBinder> propertyBinders) : base(propertyBinders) { }
protected override void SetProperty(ModelBindingContext bindingContext, string modelName, ModelMetadata propertyMetadata, ModelBindingResult result)
{
var value = result.Model as string;
if (value != null)
result = ModelBindingResult.Success(value.Trim());
base.SetProperty(bindingContext, modelName, propertyMetadata, result);
}
}
public IModelBinder GetBinder(ModelBinderProviderContext context)
{
if (context.Metadata.IsComplexType && !context.Metadata.IsCollectionType) {
var propertyBinders = new Dictionary<ModelMetadata, IModelBinder>();
for (var i = 0; i < context.Metadata.Properties.Count; i++) {
var property = context.Metadata.Properties[i];
propertyBinders.Add(property, context.CreateBinder(property));
}
return new TrimmingModelBinder(propertyBinders);
}
return null;
}
}
The ugly part of this is the copy and paste of the GetBinder logic from ComplexTypeModelBinderProvider, but there doesn't seem to be any hook to let you avoid this.
I disagree with the solution.
You should override GetPropertyValue because the data for SetProperty could also be filled by the ModelState.
To catch the raw data from the input elements write this:
public class CustomModelBinder : System.Web.Mvc.DefaultModelBinder
{
protected override object GetPropertyValue(System.Web.Mvc.ControllerContext controllerContext, System.Web.Mvc.ModelBindingContext bindingContext, System.ComponentModel.PropertyDescriptor propertyDescriptor, System.Web.Mvc.IModelBinder propertyBinder)
{
object value = base.GetPropertyValue(controllerContext, bindingContext, propertyDescriptor, propertyBinder);
string retval = value as string;
return string.IsNullOrWhiteSpace(retval)
? value
: retval.Trim();
}
}
Filter by propertyDescriptor.PropertyType if you are really only interested in string values, but it should not matter, because everything that comes in is basically a string.
There have been a lot of posts suggesting an attribute approach. Here is a package, Dado.ComponentModel.Mutations (available on NuGet), that already has a trim attribute and many others:
public partial class ApplicationUser
{
[Trim, ToLower]
public virtual string UserName { get; set; }
}
// Then to preform mutation
var user = new ApplicationUser() {
UserName = " M#X_speed.01! "
}
new MutationContext<ApplicationUser>(user).Mutate();
After the call to Mutate(), user.UserName will be mutated to m#x_speed.01!.
This example will trim whitespace and case the string to lowercase. It doesn't introduce validation, but the System.ComponentModel.Annotations can be used alongside Dado.ComponentModel.Mutations.
I posted this in another thread. In ASP.NET Core 2, I went in a different direction and used an action filter instead. This way the developer can either register it globally or use it as an attribute on the actions where string trimming should apply. The code runs after model binding has taken place, and it can update the values in the model object.
Here is my code, first create an action filter:
public class TrimInputStringsAttribute : ActionFilterAttribute
{
public override void OnActionExecuting(ActionExecutingContext context)
{
foreach (var arg in context.ActionArguments)
{
if (arg.Value is string)
{
string val = arg.Value as string;
if (!string.IsNullOrEmpty(val))
{
context.ActionArguments[arg.Key] = val.Trim();
}
continue;
}
Type argType = arg.Value.GetType();
if (!argType.IsClass)
{
continue;
}
TrimAllStringsInObject(arg.Value, argType);
}
}
private void TrimAllStringsInObject(object arg, Type argType)
{
var stringProperties = argType.GetProperties()
.Where(p => p.PropertyType == typeof(string));
foreach (var stringProperty in stringProperties)
{
string currentValue = stringProperty.GetValue(arg, null) as string;
if (!string.IsNullOrEmpty(currentValue))
{
stringProperty.SetValue(arg, currentValue.Trim(), null);
}
}
}
}
To use it, either register it as a global filter or decorate your actions with the TrimInputStrings attribute.
[TrimInputStrings]
public IActionResult Register(RegisterViewModel registerModel)
{
// Some business logic...
return Ok();
}
OK, I have this thing and it kinda works:
class TrimmingModelBinder : IModelBinder
{
    public Task BindModelAsync(ModelBindingContext ctx)
    {
        if (ctx.ModelName is string name
            && ctx.ValueProvider.GetValue(name).FirstValue is string v)
        {
            ctx.Result = ModelBindingResult.Success(v.Trim());
            ctx.ModelState.SetModelValue(name, new ValueProviderResult(ctx.Result.Model as string));
        }
        return Task.CompletedTask;
    }
}
class AutoTrimAttribute : ModelBinderAttribute
{
    public AutoTrimAttribute()
    {
        this.BinderType = typeof(TrimmingModelBinder);
    }
}
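Usage would then be something like this (a hypothetical model; any string property marked [AutoTrim] is bound through TrimmingModelBinder):
public class CommentForm
{
    [AutoTrim]
    public string Author { get; set; }
    [AutoTrim]
    public string Text { get; set; }
}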
It is a shame that there is no standard feature for this though.
I adapted #Kai G's answer for System.Text.Json:
using System;
using System.Text.Json;
using System.Text.Json.Serialization;
public class TrimmedStringConverter : JsonConverter<string>
{
public override bool CanConvert(Type typeToConvert) => typeToConvert == typeof(string);
public override string Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
{
return reader.GetString() is string value ? value.Trim() : null;
}
public override void Write(Utf8JsonWriter writer, string value, JsonSerializerOptions options)
{
writer.WriteStringValue(value);
}
}
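To register it globally for controllers that use System.Text.Json, something like this in ConfigureServices should work (a sketch; AddJsonOptions is the standard ASP.NET Core 3+ hook):
services.AddControllers()
    .AddJsonOptions(options =>
        options.JsonSerializerOptions.Converters.Add(new TrimmedStringConverter()));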
