Create reference JSON data from JSON schema using NewtonSoft - json.net

I would like to dynamically create valid reference data from a JSON schema. I'm not sure whether there is something built into Newtonsoft.Json for this, or whether there is some combination of classes and methods I could use to easily generate the appropriate JSON.
I've been tinkering with this over the past few days and this is what I have thus far.
CreateJson
static void CreateJson(KeyValuePair<string, JSchema> jsonProperty, JObject jObject)
{
    switch (jsonProperty.Value.Type)
    {
        case JSchemaType.Array:
            jObject.Add(new JProperty(jsonProperty.Key, new JArray()));
            foreach (var jItem in jsonProperty.Value.Items)
            {
                foreach (var jProp in jItem.Properties)
                {
                    CreateJson(jProp, jObject);
                }
            }
            break;
        case JSchemaType.Object:
            JObject nestedObject = new JObject();
            foreach (var jProp in jsonProperty.Value.Properties)
            {
                CreateJson(jProp, nestedObject);
            }
            jObject.Add(new JProperty(jsonProperty.Key, nestedObject));
            break;
        default:
            jObject.Add(new JProperty(jsonProperty.Key, jsonProperty.Value.Default));
            break;
    }
}
It gets called like this
example
JSchema schema = JSchema.Parse(jsonSchema);
JObject jsonObject = new JObject();

foreach (var prop in schema.Properties)
{
    CreateJson(prop, jsonObject);
}
The output is a little wonky though
{{
  "checked": false,
  "dimensions": {
    "width": 0,
    "height": 0
  },
  "id": 0,
  "name": "",
  "price": 0,
  "tags": []
}}
First, there are the double braces, and I'm not 100% certain where they came from. Next, Dimensions is supposed to be an object with Width and Height as properties, but they are not being nested inside the Dimensions object. For this POC I know exactly what and where they should be, but in production the code won't "know" anything about the schema coming in. I suspect I'll have a similar problem with the array, but I'm not sure that part actually worked, as the array is just strings and I was more interested in the object.
I was looking at this example on Newtonsoft.com, but again, that sample knows exactly where the thing it wants to find is. I am almost wondering whether the code should handle an object differently when it encounters one, or perhaps the default switch case that captures the individual properties is wrong.
Any advice is again very much appreciated.
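For illustration, here is a minimal sketch of how the JSchemaType.Array case in CreateJson above could keep item properties nested, assuming each item schema is itself an object (a hypothetical revision, not code from the post):

case JSchemaType.Array:
    JArray array = new JArray();
    foreach (var itemSchema in jsonProperty.Value.Items)
    {
        // Build a child JObject per item schema so its properties land
        // inside the array instead of on the parent object.
        if (itemSchema.Type == JSchemaType.Object)
        {
            JObject itemObject = new JObject();
            foreach (var itemProp in itemSchema.Properties)
            {
                CreateJson(itemProp, itemObject);
            }
            array.Add(itemObject);
        }
    }
    jObject.Add(new JProperty(jsonProperty.Key, array));
    break;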

Related

Asp.Net Controller action returning unexpected JSON data

I'm looking into some old code and I am seeing something I cannot figure out. The code is a controller action that returns a dynamic object:
return new
{
    Result = true,
    Count = data.Count(),
    Students = data.Select(s => string.Format("{0}, {1}", s.LastName, s.FirstName))
};
However, the resulting JSON in the browser is not coming back as I would expect:
{
  "$id": "1",
  "Result": true,
  "Count": 1,
  "Students":
  {
    "$id": "2",
    "$values": ["USER, ACTIVE"]
  }
}
What I would expect, and what I normally get any other time I do this sort of thing, is more like this:
{
  "Result": true,
  "Count": 1,
  "Students": ["USER, ACTIVE"]
}
I have no idea where the $id and $values properties are coming from. I haven't seen this happen before with .Net, so I'm not sure what is causing this. It's not the dynamic object return that's causing the problem because I switched it to a named type just to test it out and it still does the same thing.
You need to add this line of code to Global.asax to stop Json.NET from appending $id:
GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.PreserveReferencesHandling =
    Newtonsoft.Json.PreserveReferencesHandling.None;
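A minimal sketch of where that setting typically lives, assuming a standard ASP.NET Web API Global.asax.cs:

protected void Application_Start()
{
    // Turn off reference tracking so Json.NET stops emitting $id/$values.
    GlobalConfiguration.Configuration.Formatters.JsonFormatter
        .SerializerSettings.PreserveReferencesHandling =
            Newtonsoft.Json.PreserveReferencesHandling.None;
}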
You need to have a .ToList() at the end of the Students projection:
return new
{
    Result = true,
    Count = data.Count(),
    Students = data.Select(s => string.Format("{0}, {1}", s.LastName, s.FirstName)).ToList()
};

Combine Multiple JSON Files Json.NET

I have an API that currently receives JSON calls that I push to files (800KB-1MB, one per call), and I would like an hourly task that takes all of the JSON files from the last hour and combines them into a single file, to make daily/monthly analytics easier.
Each file consists of a collection of data, so it is in the format [ object {property: value, ... ]. Because of that, I cannot simply concatenate the files, since the result would no longer be valid JSON (and if I add a comma between them, the file becomes a collection of collections). I would like to keep the memory footprint as low as possible, so I was looking at the following example and pushing each file to the stream (deserializing each file with JsonConvert.DeserializeObject(fileContent)); however, doing this, I still end up with a collection of collections. I have also tried using a JArray instead of JsonConvert, pushing to a list outside of the foreach, but that produces the same result. If I move the Serialize call outside the foreach, it does work; however, I am worried about holding the 4-6GB worth of items in memory.
In summary, I'm ending up with [ [ object {property: value, ... ], ... [ object {property: value, ... ] ], where my desired output would be [ object {property: value (file1), ... object {property: value (fileN) ].
using (FileStream fs = File.Open(@"C:\Users\Public\Documents\combined.json", FileMode.CreateNew))
{
    using (StreamWriter sw = new StreamWriter(fs))
    {
        using (JsonWriter jw = new JsonTextWriter(sw))
        {
            jw.Formatting = Formatting.None;

            JArray list = new JArray();
            JsonSerializer serializer = new JsonSerializer();

            foreach (IListBlobItem blob in blobContainer.ListBlobs(prefix: "SharePointBlobs/"))
            {
                if (blob.GetType() == typeof(CloudBlockBlob))
                {
                    var blockBlob = (CloudBlockBlob)blob;
                    var content = blockBlob.DownloadText();
                    var deserialized = JArray.Parse(content);
                    //deserialized = JsonConvert.DeserializeObject(content);
                    list.Merge(deserialized);
                    serializer.Serialize(jw, list);
                }
                else
                {
                    Console.WriteLine("Non-Block-Blob: " + blob.StorageUri);
                }
            }
        }
    }
}
In this situation, to keep your processing and memory footprints low, I think I would just concatenate the files one after the other even though it results in technically invalid JSON. To deserialize the combined file later, you can take advantage of the SupportMultipleContent setting on the JsonTextReader class and process the object collections through a stream as if they were one whole collection. See this answer for an example of how to do this.
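For example, a minimal sketch of reading the concatenated file back (assuming it is several JSON arrays written back to back, and using Newtonsoft.Json and Newtonsoft.Json.Linq):

using (var sr = new StreamReader(@"C:\Users\Public\Documents\combined.json"))
using (var reader = new JsonTextReader(sr) { SupportMultipleContent = true })
{
    while (reader.Read())
    {
        // Each pass loads the next top-level array in the file.
        JArray batch = JArray.Load(reader);
        foreach (var item in batch)
        {
            // Process each item as if the batches were one whole collection.
        }
    }
}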

Getting property values from dynamic

I use Dapper with dynamics because the table to be queried is not known until runtime, so POCO classes aren't possible.
I am returning Dapper's results via WebAPI. To save bandwidth, I need to return just the values from Dapper, not the property names, e.g.:
{
  7, "2013-10-01T00:00:00", 0, "AC", null, "ABC", "SOMESTAGE"
},...
And not:
{
  TID: 7,
  CHANGE_DT: "2013-10-01T00:00:00",
  EFFECTIVE_APPTRANS: 0,
  EFFECTIVE_APPTRANS_STATUS: "AC",
  DEVICE: null,
  PROCESS: "ABC",
  STAGE: "SOMESTAGE"
},...
I'm having some trouble figuring out a reasonable way to do this. I've tried abusing Dapper's mapping feature:
var tableData = Database.Query<dynamic, dynamic, dynamic>(connectInfo, someResource.sql,
    (x, y) =>
    {
        List<object> l = new List<object>();
        object o = x;
        foreach (var propertyName in o.GetType().GetProperties().Select(p => p.Name))
        {
            object value = o.GetType().GetProperty(propertyName).GetValue(o, null);
            l.Add(value);
        }
        return l as dynamic;
    },
    new { implantId, pendingApptrans },
    splitOn: "the_last_column");
I've also tried having Dapper return a List using the same base code.
The idea was that I'd extract property values within the map function because it allows me to play with rows before anything is returned, but I get empty results without error:
[
[ ], [ ], [ ], [ ], [ ], [ ], [ ]
]
Additionally, I don't know any column names to split on, which the mapping feature wants. However, even when I enter the last column name from a test query, the results are the same.
How can I return only values from Dapper's dynamic return value?
Do I need to resort to post-processing the dynamic after the Dapper call?
For that particular set of requirements, yes you would need to post-process; for example, you could cast to DapperRow or IDictionary<string,object>. Alternatively, you could use the IDataReader API that dapper now exposes.
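As an illustration, a minimal sketch of that post-processing, assuming rows is the IEnumerable<dynamic> that Dapper returned (the variable name is hypothetical):

// Dapper's dynamic rows also implement IDictionary<string, object>,
// so the values can be pulled out without knowing any column names.
var valuesOnly = rows
    .Cast<IDictionary<string, object>>()
    .Select(row => row.Values.ToArray())
    .ToList();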

In Meteor, how to remove items from a non-Mongo Collection?

In Meteor, I am publishing a collection from a non-Mongo source (IMAP specifically).
Meteor.publish("search_results", function (user, password, str) {
    var self = this;
    res_msg = [];

    Imap.connect({... });
    Imap.search(str, resultcb);

    for (var i = 0; i < res_msg.length; i++) {
        self.set("s_results", Meteor.uuid(), {
            uid: res_msg[i].uid,
            date: res_msg[i].date,
            headers: res_msg[i].headers
        });
    }

    self.flush();
    self.complete();
    self.flush();

    console.log("total messages : ", res_msg.length);
});
This works fine. However, on the second pass, new items are appended to the collection. There does not appear to be a way to remove records from a non-Mongo collection.
It seems from the documentation that if I use this.unset, it will change attributes, not remove the record(s).
I can't use collection.remove({}) either client or server side.
I found a really ugly way to do this, so I'm leaving the question open in the hopes that there is a better alternative.
Basically, if you unset all the attributes, the document goes away. The question is how to iterate over the collection within the publish method to find all documents so their attributes can be unset. Creating a collection doesn't seem to work, let alone calling .find().
I stored the list of ids in a separate array. Ugly, I know. I hope you can do better.
for (var i = 0; i < uuids.length; i++) {
    self.unset("s_results", uuids[i], {});
}
uuids = [];

Imap.search(str, resultcb);

for (var i = 0; i < res_msg.length; i++) {
    var u = Meteor.uuid();
    self.set("s_results", u, {
        uid: res_msg[i].uid,
        date: res_msg[i].date,
        headers: res_msg[i].headers
    });
    uuids.push(u);
}

In MVC app using JQGrid - how to set user data in the controller action

How do you set the userdata in the controller action? The way I'm doing it is breaking my grid. I'm trying a simple test with no luck. Here's my code, which does not work. Thanks.
var dataJson = new
{
    total =
    page = 1,
    records = 10000,
    userdata = "{test1:thefield}",
    rows = (from e in equipment
            select new
            {
                id = e.equip_id,
                cell = new string[] {
                    e.type_desc,
                    e.make_descr,
                    e.model_descr,
                    e.equip_year,
                    e.work_loc,
                    e.insp_due_dt,
                    e.registered_by,
                    e.managed_by
                }
            }).ToArray()
};
return Json(dataJson);
I don't think you have to convert it to an array. I've used jqGrid, and I just let the Json function serialize the object. I'm not certain that would cause a problem, but it's unnecessary at the very least.
Also, your userdata would evaluate to a string (because you are sending it as a string). Try sending it as an anonymous object, i.e.:
userdata = new { test1 = "thefield" },
You also need a value for total and a comma between it and page. (I'm guessing that's a typo; I don't think it would compile as is.)
EDIT:
Also, I would recommend adding the option jsonReader: { repeatitems: false } to your JavaScript. This allows you to send your collection in the rows field without converting it to the { id: ID, cell: [ data_row_as_array ] } syntax. You can set the property key = true in your colModel to indicate which field is the ID. It makes it a lot simpler to pass data to the grid, as sketched below.
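For illustration, a minimal sketch of the action with those suggestions applied (field names are taken from the question; the grid is assumed to use jsonReader: { repeatitems: false }):

var dataJson = new
{
    total = 1,                                // placeholder: total must have a value
    page = 1,
    records = 10000,
    userdata = new { test1 = "thefield" },    // anonymous object, not a string
    rows = from e in equipment                // plain objects; no id/cell wrapping
           select new
           {
               e.equip_id,                    // set key = true on this column in colModel
               e.type_desc,
               e.make_descr,
               e.model_descr,
               e.equip_year,
               e.work_loc,
               e.insp_due_dt,
               e.registered_by,
               e.managed_by
           }
};
return Json(dataJson);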
