I've only been using the JsonCpp library for a couple of months now. I'm trying to add and remove an object in an array. I've used a number of different JSON libraries on different platforms, but I'm finding it very difficult to work with JsonCpp.
Here is the Json:
{ "type": "Disc",
"media": "DVD",
"adapter": "DVDCodecs",
"transportControls" : [
{"Action":"Left", "ActionCode" : "1a"},
{"Action":"Right", "ActionCode" : "2a"},
{"Action":"Up", "ActionCode" : "1b"},
{"Action":"Down", "ActionCode" : "4c"},
{"Action":"Center", "ActionCode" : "5e"},
{"Action":"OK", "ActionCode" : "5a"},
{"Action":"SubTitles", "ActionCode" : "3b"},
{"Action":"SubTitlesLang", "ActionCode" : "7d"},
{"Action":"Audio", "ActionCode" : "7a"},
{"Action":"Angle", "ActionCode" : "6a"},
{"Action":"Next", "ActionCode" : "6c"},
{"Action":"Previous", "ActionCode" : "8b"},
{"Action":"DVDMenu", "ActionCode" : "8c"},
{"Action":"Search", "ActionCode" : "8d"},
{"Action":"Region", "ActionCode" : "3a"},
{"Action":"Display", "ActionCode" : "2e"},
{"Action":"RootMenu", "ActionCode" : "6b"},
{"Action":"FastForward", "ActionCode" : "81"},
{"Action":"Rewind", "ActionCode" : "8b"},
{"Action":"FrameForward", "ActionCode" : "8c"},
{"Action":"Parking"},
{"Action":"Seekable"}
]
}
I've been trying to add and remove an objectValue to and from the transportControls array. To add an object I have been doing this:
Json::Value addObj;
Json::Reader reader;
reader.parse("{\"Action\":\"BlueButton\", \"ActionCode\" : \"9a\"}", addObj );
root["transportControls"].append( addObj );
Which seems to work well. If there is a more elegant way of doing this, I'd like to know that too.
My problem is: how do I remove it after I've added it? I can remove all the members of the object, but that doesn't actually seem to remove the object from the arrayValue map.
What is the "best practice" way to remove an Object Value from an Array Value using JsonCPP?
I finally had some time to dig into the source code, and the easy answer is: no.
An arrayValue object is really just stored like an objectValue: a std::map keyed by a consecutive array index. If you call std::map::erase() on an element in the map, you break the consecutive index-key sequence for the array. std::map doesn't allow you to edit a key in place, so you would have to move all the Value objects after the gap up one slot and delete the last entry before end() to actually "delete" the object.
That sounds like a lot of overhead. Why do I have to move everything up, you may ask? Because the JsonCpp Writer classes use the map index to print out the values. If a key isn't found (because of a gap in the series), nullValue is returned for that index. That's what you see when you call root.toStyledString() to convert back to a string; after a while you have these "null," entries all over the place. On a Value object, if you're not calling the const version ( const Value &operator[]( ArrayIndex index ) const; ) you will insert a nullValue object into the array. The parser uses the Value::operator[]( ArrayIndex index ) version to insert new defaultValue objects into the map while it's tokenizing your JSON.
Answer: no. You can't delete an object from an arrayValue without making code changes to clean up the map.
More info here: Changing the key of an element inside a std::map
There is now removeIndex(), but as tommygr says, it is an expensive operation in the current implementation.
Json::Value new_items(Json::arrayValue);
for(Json::ArrayIndex i = 0; i < items.size(); i++)
{
if(items[i] != selected_item)
{
new_items.append(items[i]);
}
}
items = new_items;
Related
I have a JSON response from a web service that looks something like this:
[
{
"id":4,
"sourceID":null,
"subject":"SomeSubjectOne",
"category":"SomeCategoryTwo",
"impact":null,
"status":"completed"
},
{
"id":12,
"sourceID":null,
"subject":"SomeSubjectTwo",
"category":"SomeCategoryTwo",
"impact":null,
"status":"assigned"
}
]
What I need to do is extract the subjects from all of the entities using a JSONPath query.
How can I get these results :
Subject from the first item - SomeSubjectOne
Filter on specific subject value from all entities (SomeSubjectTwo for example)
Get Subjects from all entities
Goessner's original JSONPath article is a good reference point, and all implementations more or less stick to the suggested query syntax. However, implementations like Jayway JsonPath (Java), JSONPath-Plus (JavaScript), or flow-jsonpath (PHP) may behave a little differently in some areas. That's why it can be important to know which implementation you are actually using.
Subject from the first item
Just use an index to select the desired array element.
$[0].subject
Returns:
SomeSubjectOne
Specific subject value
First, go for any elements with .., check those that have a subject with [?(@.subject)], and use == '..' for comparison.
$..[?(@.subject == 'SomeSubjectTwo')]
Returns
[ {
"id" : 12,
"sourceID" : null,
"subject" : "SomeSubjectTwo",
"category" : "SomeCategoryTwo",
"impact" : null,
"status" : "assigned" } ]*
Get all subjects
$[*].subject
or simply
$..subject
Returns
[ "SomeSubjectOne", "SomeSubjectTwo" ]
I am trying to write an R script in which I have to do some operations on a Mongo database. Therefore, I have a few questions:
How do I use a double condition? I know how to use a single condition:
mongoDB$find(query = '{"id" : { "$in" : ["1","2"]}}')
mongoDB$find(query = '{"date" : { "$in" : "2019-08-09"}}')
How should I connect both conditions in one query? How do I write such code?
How do I use a parameter in the Mongo query? In my script I will have a vector with a dynamic number of IDs. How should I write the query? I am looking for something like:
VectorWithIDs <- c(1:1000)
mongoDB$find(query = '{"id" : { "$in" : VectorWithIDs}}')
Any ideas how to solve both problems? Thanks in advance!
Just combine both conditions in one query document, using "," as the separator.
For example:
mongoDB$find(query = '{"id" : { "$in" : ["1","2"]}, "date" : "2019-08-09"}')
(Note that "$in" expects an array; a single date can be matched by plain equality as above, or with "$in" : ["2019-08-09"].)
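For the second question, the query string can be generated from the dynamic vector instead of hard-coded. Shown here in Python for illustration (the helper name build_in_query is mine); in R, jsonlite::toJSON(VectorWithIDs) produces the same array literal, which can be pasted into the query string:

```python
import json

def build_in_query(field, values):
    """Serialize a MongoDB query like {"field": {"$in": [...]}} from a dynamic list."""
    return json.dumps({field: {"$in": values}})

query = build_in_query("id", ["1", "2", "3"])
# query can then be passed as mongoDB$find(query = ...)
```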
I am trying to set up a simple connection between a NodeJS application and an R script.
So far I've managed to set up the basic connection and the running of the script, using r-script (found on npm).
However, I am not able to pass a simple JSON to the R script so that it can be converted to a data frame using jsonlite.
NodeJS Code:
var R = require("r-script");
var out = R("script.R")
.data({"Name" : "Mario", "Age" : 32, "Occupation" : "Plumber"}, {"Name" : "Peach", "Age" : 21, "Occupation" : "Princess"}, {}, {"Name" : "Bowser", "Occupation" : "Koopa"})
.callSync();
console.log(out);
script.R:
library("jsonlite")
mydf <- fromJSON(input[[1]])
This gives the output:
'Argument 'txt' must be a JSON string, URL or file.'
I have tried removing the indexing of the vector (and giving the full list to fromJSON) but that also doesn't work.
Has anyone successfully passed JSON to an R script with this npm module?
Thanks in advance!
EDIT:
Also, if I place the JSON between single quotes, it gives me "trailing" errors on the spaces, and if those are removed, on the { character.
I have no idea how R works; however, the error could point to the fact that what you are trying to pass is a JavaScript object, not JSON. JavaScript objects look like JSON but are not identical to it (see the discussion here: JavaScript object vs. JSON).
One thing you can try is passing the data function a JSON string by calling JSON.stringify on your objects. Something like this:
var R = require("r-script");
var out = R("script.R")
.data(JSON.stringify({"Name" : "Mario", "Age" : 32, "Occupation" : "Plumber"}), JSON.stringify( {"Name" : "Peach", "Age" : 21, "Occupation" : "Princess"}),JSON.stringify ({}), JSON.stringify({"Name" : "Bowser", "Occupation" : "Koopa"}))
.callSync();
console.log(out);
It's a long shot, but it shows a general direction in which you can debug.
I managed to find a simple solution; Manos Agelidis's answer put me in the right direction.
Basically, I was able to parse the input string using:
string = paste(input, collapse=",")
string = paste("[",string,"]")
frame = fromJSON(string)
and in NodeJS:
var out = R("script.R")
.data('{"Name":"Mario","Age":32,"Occupation":"Plumber"}',
'{"Name":"Peach","Age":21,"Occupation":"Princess"}',
'{}',
'{"Name":"Bowser","Occupation":"Koopa"}')
.callSync();
fromJSON requires a vector in string form, not a literal vector. Therefore, what I needed to do was to create a string from the elements of the input and add [ and ] to the beginning and end of the string, respectively. This converted to a data frame properly.
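The wrapping the R script does (join the row strings with commas and bracket them) can be verified on the Node side before sending; this sketch uses the same sample rows as the answer:

```javascript
// Each .data() argument is a JSON *string*, not a JavaScript object.
const rows = [
  '{"Name":"Mario","Age":32,"Occupation":"Plumber"}',
  '{"Name":"Peach","Age":21,"Occupation":"Princess"}',
  '{}',
  '{"Name":"Bowser","Occupation":"Koopa"}',
];

// Mirror of the R side: paste(input, collapse=",") wrapped in [ ... ].
const payload = "[" + rows.join(",") + "]";
const parsed = JSON.parse(payload);  // throws if any row is not valid JSON
console.log(parsed.length);          // 4
```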
I am new to Meteor and MongoDB and have been searching for an answer to this question for some time without any luck.
I have multiple documents in MongoDB similar to the one below:
{
"_id" : ObjectId("5abac4ea0c31d26804421371"),
"Points" : [
{
"Value" : 6.869752766626993,
"Time" : 1522284528946
},
{
"Value" : 3.9014587731230477,
"Time" : 1522284543946
},
{
"Value" : 1.2336926618519772,
"Time" : 1522284558946
},
{
"Value" : 6.504837583667155,
"Time" : 1522284573946
},
{
"Value" : 9.824138227740864,
"Time" : 1522284588946
},
{
"Value" : 9.707480757899235,
"Time" : 1522284603946
},
{
"Value" : 4.6122167850338105,
"Time" : 1522284618946
}
]
}
How can I implement a query in Meteor that returns an array containing all the Points from all documents whose 'Time' field is greater than a certain value?
As Jankapunkt has pointed out in his comment, it might be a lot easier and better to create a new collection Points, where each document includes only the Value and Time attributes. The given example would then become seven separate documents rather than a single array.
It does nevertheless happen that we want to query documents according to some inner values, e.g. attributes of objects in arrays.
Taken from the mongodb documentation on querying embedded documents, we can just use dot notation for this.
If you do not know the index position of the document nested in the array, concatenate the name of the array field, with a dot (.) and the name of the field in the nested document.
Such as for your question (assuming Points to be the name of your collection):
db.points.find( { 'Points.Time': { $gte: 123412341234 } } )
Which looks almost identical in Meteor:
Points.find({ 'Points.Time': { $gte: 123412341234 } })
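One caveat worth noting (my reading of the question, not part of the answer above): find() with dot notation returns whole documents in which at least one point matches; it does not strip the non-matching points. To get a flat array of only the matching points, you can filter client-side after fetching. A sketch, where docs stands in for the result of .fetch():

```javascript
const cutoff = 1522284560000;

// Stand-in for Points.find({ 'Points.Time': { $gte: cutoff } }).fetch(),
// using a trimmed copy of the sample document.
const docs = [
  { Points: [
      { Value: 6.87, Time: 1522284528946 },
      { Value: 1.23, Time: 1522284558946 },
      { Value: 6.50, Time: 1522284573946 },
  ] },
];

// Flatten all Points arrays, keeping only points past the cutoff.
const matching = docs
  .flatMap(d => d.Points)
  .filter(p => p.Time >= cutoff);
console.log(matching.length); // 1
```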
The database structure for a Firebase "GeoFire" node must look like this (source)
"items" : {
<itemId> : {
"someData" : "someData",
...
}
},
"items_location" : {
<itemId> : {
<geofireData> ...
}
}
But one limitation of Geofire is that only single points can be stored and queried, and no objects like polygons. That's easy to work around - my code can query nearby points, and then reassemble the simple rectangles based on having the same key.
But in splitting my rectangles into individual points, I've created GeoFire keys with the following format:
ABCD_0
Where ABCD is the original Primary Key of the rectangle, and _0 indicates which corner, so as to have each point with a unique key. One rectangle is represented in GeoFire as
"items" : {
<ABCD_0> : {<objectData>},
<ABCD_1> : {<objectData>},
<ABCD_2> : {<objectData>},
<ABCD_3> : {<objectData>}
},
"items_location" : {
<ABCD_0> : {<geofireData 0>},
<ABCD_1> : {<geofireData 1>},
<ABCD_2> : {<geofireData 2>},
<ABCD_3> : {<geofireData 3>}
}
But then to force identical keys in items and items_location, <objectData> is 4x redundant.
In order to decrease data volume, I'd like to use the original Primary Key in the items node, and then replicate the key with the _X structure for the items_location node. The App would then query GeoFire, get a list of (4) nearby keys, and then string-parse ABCD_X into ABCD, which it would use for the subsequent query.
"items" : {
<ABCD> : {<objectData>},
},
"items_location" : {
<ABCD_0> : {<geofireData 0>},
<ABCD_1> : {<geofireData 1>},
<ABCD_2> : {<geofireData 2>},
<ABCD_3> : {<geofireData 3>}
}
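The string-parse step described above can be sketched like this (the helper name baseKey is mine; lastIndexOf is used so that a primary key which itself contains underscores would still parse correctly):

```javascript
// Recover the rectangle's primary key from a corner key like "ABCD_2".
function baseKey(cornerKey) {
  const i = cornerKey.lastIndexOf("_");
  return i === -1 ? cornerKey : cornerKey.slice(0, i);
}
console.log(baseKey("ABCD_2")); // "ABCD"
```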
Will this break how GeoFire stores, indexes, retrieves and offlines data?
I'm especially concerned about how small sets of data are synchronized offline for individual apps. The entire geo-dataset is too large for a single app to store in its entirety offline.