How can I query nested objects in Realm DB with Objective-C?

I'm researching how to use Realm DB for my new project and I have run into some issues. Please share your experience working with Realm DB. Sorry about my long question list.
First, please refer to my sample code.
Dog class
Dog.h
@interface Dog : RLMObject
@property NSString *name;
@property NSInteger age;
@end
RLM_ARRAY_TYPE(Dog) // define RLMArray<Dog>
And Person class
Person.h
@interface Person : RLMObject
@property NSString *name;
@property NSInteger age;
// to-many relationship
@property RLMArray<Dog *><Dog> *dogs;
@end
RLM_ARRAY_TYPE(Person) // define RLMArray<Person>
Then I create data for Person and Dog (1,000,000 records); one person has one to many dogs. I got stuck in the following cases.
How do I get the people who have a dog named "Rex"? I researched this but there is no guide for Objective-C.
The performance of deleting objects seems to be slow.
I tried to delete every person named "Ethan"; the performance is slow and the app crashes after executing half of the list. I guess I used an incorrect way to delete objects:
RLMResults *people = [Person objectsWhere:@"name == 'Ethan'"];
// Get the default Realm
RLMRealm *realm = [RLMRealm defaultRealm];
[realm beginWriteTransaction];
for (int i = 0; i < people.count; i++) {
    Person *aPerson = [people objectAtIndex:i];
    [realm deleteObject:aPerson];
}
[realm commitWriteTransaction];
The result: there are >52,000 records with the name "Ethan", and the app deletes only half of them (26,000).
I don't know how to delete records matching a condition in Realm. For my question #2 above, I would like to be able to write code like this:
[Person deleteWhereObject:@"name = 'Ethan'"];
It is not yet clear to me how to rename, delete, or add a new column to the DB after it is created (apart from the simplest way, which is to delete the DB and recreate it).
The tool for browsing the data file created on the desktop (Realm Browser) doesn't provide much flexibility in querying data. It only allows browsing through the data but doesn't allow querying with a condition. Please guide me on how to query data with this tool if I missed something.
For troubleshooting, I may receive an existing DB from a client app and want to import it into my project to troubleshoot the client's bugs. How can I do that with Realm DB?
After inserting 1,000,000 records into the Person and Dog tables, the DB size is 52.8 MB. But the DB size increases to 92.3 MB after I delete all the data:
// delete all objects
[realm beginWriteTransaction];
[realm deleteAllObjects];
[realm commitWriteTransaction];
Then I insert data again and the file size continues to increase. I don't know what's wrong with my steps.
Hope to get your support soon!

RLMResults *people = [Person objectsWhere:@"ANY dogs.name == 'Rex'"];
Use deleteObjects: with the retrieved RLMResults instead of deleting every single object yourself. If you enumerate, I'd recommend using NSFastEnumeration with for (… in …). Note that RLMResults are auto-updating, so if you make concurrent changes you can run into surprises.
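A minimal sketch of that approach, reusing the query from your question:
RLMRealm *realm = [RLMRealm defaultRealm];
RLMResults *people = [Person objectsWhere:@"name == 'Ethan'"];
[realm beginWriteTransaction];
// delete the whole result set in one call instead of index-by-index
[realm deleteObjects:people];
[realm commitWriteTransaction];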
See my answer for question 2.
Change your schema, bump the schemaVersion of your RLMRealmConfiguration, and provide a migration block if you rename properties or change their type.
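A rough sketch of such a configuration (the version number and the empty migration body are just placeholders):
RLMRealmConfiguration *config = [RLMRealmConfiguration defaultConfiguration];
config.schemaVersion = 2; // bump whenever the schema changes
config.migrationBlock = ^(RLMMigration *migration, uint64_t oldSchemaVersion) {
    if (oldSchemaVersion < 2) {
        // map renamed properties / changed types here;
        // newly added properties are picked up automatically
    }
};
[RLMRealmConfiguration setDefaultConfiguration:config];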
This isn't possible yet, but still in the making. This is tracked by issue #28 in the realm-browser-osx repo.
Put the realm file in the app bundle and copy it at runtime into the user data directory (e.g. RLMRealm.defaultConfiguration.path), if you want to be able to write to the file.
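A sketch of that copy step, assuming a bundled file named client.realm and the path-based configuration API mentioned above (newer Realm versions use fileURL instead):
NSString *bundledPath = [[NSBundle mainBundle] pathForResource:@"client" ofType:@"realm"];
NSString *defaultPath = [RLMRealmConfiguration defaultConfiguration].path;
NSFileManager *fileManager = [NSFileManager defaultManager];
if (![fileManager fileExistsAtPath:defaultPath]) {
    // copy the read-only bundled file into a writable location
    [fileManager copyItemAtPath:bundledPath toPath:defaultPath error:nil];
}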
The file size blows up because no compaction is happening. You can force compaction by writing a compacted copy with writeCopyToPath:. You could do this e.g. at app start, because otherwise it can be non-trivial to make sure that all RLMRealm instances have been torn down beforehand.
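A sketch of that compaction step, again assuming the older path-based API this answer refers to:
RLMRealm *realm = [RLMRealm defaultRealm];
NSString *compactedPath = [realm.path stringByAppendingString:@".compact"];
NSError *error = nil;
// writes a compacted copy; the original file can be replaced with it
// once no RLMRealm instances are open any more
[realm writeCopyToPath:compactedPath error:&error];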

Related

Does returning an object from an insert method violate the CQRS pattern?

I have implemented MediatR in my ASP.NET Core Web API application.
Our controllers simply send a command or query to MediatR and return the result.
[HttpPost]
public async Task<IActionResult> CreateQuestion(CreateQuestionCommand command)
{
    await Mediator.Send(command);
    return Ok();
}
Because the CQRS pattern says commands should not return any value, we don't return any value from our MediatR commands.
Everything was fine until we decided to write some BDD tests.
In our BDD tests there is a simple scenario like this:
Scenario: [Try to add a new Question]
    Given [I'm an authorized administrator]
    When [I create a new Question with Title '<Title>' and '<Description>' and '<IsActive>' and '<IndicatorId'>]
    Then [A question with Title '<Title>' and '<Description>' and '<IsActive>' and '<IndicatorId'> should be persisted in database]
Examples:
    | Title | Description | IsActive | IndicatorId |
    | This is a test title | this is a test description | true | 3cb23a10-107a-4834-8c1a-3fd893217861 |
We set the Id property of the Question object in its constructor, which means we don't know what Id was assigned to this newly created Question, and therefore we can't read it back after adding it to the database in a test environment.
My question is: how do I test this in BDD?
This is my test step implementation:
[When(@"\[I create a new Question with Title '([^']*)' and '([^']*)' and '([^']*)' and '([^']*)'>]")]
public async void WhenICreateANewQuestionWithTitleAndAndAnd(string title, string description, bool isActive, Guid indicatorId)
{
    var command = new CreateQuestionCommand
    {
        Title = title,
        Description = description,
        IndicatorId = indicatorId
    };
    await questionController.CreateQuestion(command);
}
[Then(@"\[A question with Title '([^']*)' and '([^']*)' and '([^']*)' and '([^']*)'> should be persisted in database]")]
public void ThenAQuestionWithTitleAndAndAndShouldBePersistedInDatabase(string title, string description, string isActive, string indicatorId)
{
    // ? how to retrieve the created data. We don't have the Id here
}
How can I retrieve the added Question?
Should I change my command handlers to return the object after inserting it into the database?
And if I do so, wouldn't it be a CQRS violation?
Thank you for your time.
There's a couple of ways to go about this, but it depends on how you actually expect the system to behave. If you're doing BDD you should focus on the kind of behaviour that would be observable by real users.
If you're implementing a system that allows users to create (and save, I infer) questions in a (Q&A?) database, then should they not have the ability to view or perhaps edit the question afterwards?
As the OP is currently phrased, the trouble writing the test seems to imply a lacking capability of the system in general. If the test can't identify the entity that was created, then how can a user?
The CreateQuestion method only returns Ok(), which means that there's no data in the response. Thus, as currently implemented, there's no way for the sender of the POST request to subsequently navigate to or retrieve the new resource.
How are users supposed to interact with the service? This should give you a hint about how to write the test.
A common REST pattern is to return the address of the new resource in the response's Location header. If the CreateQuestion method were to do that, the test would also be able to investigate that value and perhaps act on it.
Another option is to return the entity ID in the response body. Again, if that's the way things are supposed to work, the test could verify that.
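For instance, a minimal sketch of the Location-header option (returning the generated Id from the handler and having a GetQuestion endpoint are assumptions, not part of your current code):
[HttpPost]
public async Task<IActionResult> CreateQuestion(CreateQuestionCommand command)
{
    // assumes the handler is changed to return the generated Id
    var id = await Mediator.Send(command);
    // 201 Created with a Location header pointing at the (assumed) GetQuestion endpoint
    return CreatedAtAction(nameof(GetQuestion), new { id }, null);
}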
Other systems are more asynchronous. Perhaps you only put the CreateQuestionCommand on a queue, to be handled later. In such a case I'd write a test that verifies that the command was added to the queue. You could also write a longer-running test that waits for an asynchronous message handler to process the command, but if you want to test something like that, you also need to deal with timeouts.
In a comment, you write:
I think he might add the question and then read all questions from database and see if his question is amongst them. He'd probably test without knowing Id.
Do you expect real users of the system to spelunk around in your database? Unless you do, the database is not part of a system's observable behaviour - it's an implementation detail. Thus, behaviour-driven design shouldn't concern itself with the contents of databases, but how a system behaves.
In short, find out how to observe the behaviour you want the system to have, and then test that.

Get and Update Workflow data in Filenet

I am having a hard time figuring out how to get the workflow data from FileNet. I tried using the Process Engine and the Content Engine but I am lost on where to look. Should I use PE or CE? Also, which particular part of the API?
I can already get the list of object stores from the CE. I can also get the list of search parameters and their data from the PE, but I am lost on how to get the workflow step properties and their data, and possibly update them through the Java API.
You need to query for the work items using the PE API. Assuming the work items are in a queue:
VWQueueQuery vwQueueQuery = yourQueue.createQuery(String indexName, Object[] firstValues, Object[] lastValues, int queryFlags, String filter, Object[] substitutionVars, int fetchType);
then
while (vwQueueQuery.hasNext()) {
    vwStepElement = (VWStepElement) vwQueueQuery.next();
    // lock if you want to modify the workitem
    vwStepElement.doLock(true);
    // once you have vwStepElement, there are different ways to get properties
    String[] properties = vwStepElement.getParameterNames(); // this will give you all the properties that are exposed for that queue
    // if you want to get a specific property then use
    Object specificParameter = vwStepElement.getParameterValue("propName");
    // then if you want to set a value
    vwStepElement.setParameterValue(parameterName, parameterValue, compareValue);
    // finally, if you want to save and dispatch to next level
    vwStepElement.setSelectedResponse(response);
    vwStepElement.doSave(true);
    vwStepElement.doDispatch();
}

How to update/insert/delete an item in an Akavache list object?

Should I use Akavache as the primary local database in my Xamarin.Forms application, or as a cache on top of another SQLite database? I can't find any example of how to update, insert, or delete data in an Akavache object. For example:
CacheItems = await BlobCache.LocalMachine.GetOrFetchObject<List<Item>>("CacheItems",
    async () => await getdatafromAzure(recursive));
I am getting the items from Azure and storing them on the local machine; these items are editable/deletable, and the user can also add a new item. How do I do that?
Anything saved to LocalMachine gets persisted physically to the device, so on app or device restart it'll still be there (if the user hasn't removed the app or cleared the data, that is).
As far as how to access/save data, there are lots of good samples here:
https://github.com/akavache/Akavache
InsertObject and GetObject are your basic access methods, and then there are lots of extension methods like GetOrFetch and GetAndFetch, which are very useful.
Here's a quick sample (I haven't thoroughly tested it) to show one way to access things. It'd probably be better to use some of the extension methods, but I figure an example like this is conceptually useful.
BlobCache.LocalMachine.GetObject<Tobject>(someKEy)
    .Catch((KeyNotFoundException ke) => Observable.Return<Tobject>(null))
    .SelectMany(result =>
    {
        // object doesn't exist in cache so create a new one
        if (result == null)
            result = new Tobject();
        // apply whatever updates you are wanting to do
        // result.SomeField = "bob";
        // This will replace or insert data
        return BlobCache.LocalMachine.InsertObject(someKEy, result);
    })
    .Subscribe();
It's really all pretty boring stuff :-p Just get an object and store an object. Under the hood Akavache does a lot of really cool optimizations and synchronizations around that boring stuff though allowing it to be boring for the rest of us
In most of my cases when I start up a VM I retrieve the object from the cache and then just store it as some property on the VM or inside some service class. Then when any changes are made to the object I just insert it into the cache
BlobCache.LocalMachine.InsertObject(someKEy, result).Subscribe()
At that point I know that if the app closes down I'll have the latest version of that object right when the user starts up the app.
The examples I gave are more the full Rx way of accessing... What you have in your original question works fine
await BlobCache.LocalMachine.GetOrFetchObject<List<object>>("CacheItems",
async () => await getdatafromAzure(recursive));
That will basically check the cache first; if the object doesn't exist there, it goes to Azure...
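To then edit, delete, or add items, a rough sketch is to load the cached list, change it, and write the whole list back under the same key (the Id property on Item and the placeholder values are assumptions):
// load the cached list (fall back to an empty list if the key isn't there yet)
var items = await BlobCache.LocalMachine
    .GetObject<List<Item>>("CacheItems")
    .Catch(Observable.Return(new List<Item>()));
// update / delete / insert like any other list
items.RemoveAll(i => i.Id == idToDelete); // idToDelete: your own value
items.Add(newItem);                       // newItem: your own Item instance
// write the whole list back under the same key
await BlobCache.LocalMachine.InsertObject("CacheItems", items);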
LocalMachine stores to the physical device, InMemory just stores to an internal dictionary that goes away once the app is unloaded from memory, and UserAccount works with NT roaming accounts.

Persist Data with DotNetRdf and Virtuoso Manager

I tried following the dotnetrdf documentation to store data in a Virtuoso Server.
Here is what I do:
public void LoadGraph()
{
    // Create our Storage Provider - this example uses Virtuoso Universal Server
    VirtuosoManager virtuoso = new VirtuosoManager("creativeartefact.org", 1111, "DB", "user", "password");
    // Load the Graph into an ordinary graph instance first
    Graph g = new Graph();
    virtuoso.LoadGraph(g, new Uri("http://creativeartefact.org/"));
    // Then place the Graph into a wrapper
    StoreGraphPersistenceWrapper wrapper = new StoreGraphPersistenceWrapper(virtuoso, g);
    // Now make changes to this Graph as desired...
    g.Assert(g.CreateUriNode(new Uri("http://creativeartefact.org/testB/123")), g.CreateUriNode("rdf:Type"), g.CreateUriNode(new Uri("http://creativeartefact.org/ontology/Artist")));
    wrapper.Flush(); // mandatory, but doesn't help either
    // Remember to call Dispose() to ensure changes get persisted when you are done
    wrapper.Dispose();
}
But no data is saved and no exception is thrown. After inserting the triple, the triple count is raised by 1 as expected, but it doesn't make its way into the database. The triple count is back to the old value when I rerun the code. The user account has write permissions.
Any idea what I am missing?
Thanks in advance,
Frank
You're not fully reading the doc you're following.
The easiest way to save a Graph to a Triple Store is to use the SaveGraph() method of an IStorageProvider implementation, see Triple Store Integration for more details.
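A minimal sketch of that approach, reusing the connection details from your snippet (the full rdf:type URI is used here instead of the "rdf:Type" prefixed form):
VirtuosoManager virtuoso = new VirtuosoManager("creativeartefact.org", 1111, "DB", "user", "password");
Graph g = new Graph();
g.BaseUri = new Uri("http://creativeartefact.org/");
virtuoso.LoadGraph(g, g.BaseUri);
// make the changes on the in-memory graph...
g.Assert(g.CreateUriNode(new Uri("http://creativeartefact.org/testB/123")),
         g.CreateUriNode(new Uri("http://www.w3.org/1999/02/22-rdf-syntax-ns#type")),
         g.CreateUriNode(new Uri("http://creativeartefact.org/ontology/Artist")));
// ...then persist the whole graph back to the store
virtuoso.SaveGraph(g);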

What is the best practice to update a Vertex after it is detached from the DB with Tinkerpop Frames?

Let me give an example:
I receive a Vertex with Tinkerpop Blueprints, then I use Frames to convert it into an entity.
I close the database (so from now on the node is detached from the DB)
and I show the node on a web page to let the user modify it.
The user makes some modifications, then I should persist the changes.
The problem is that the instance of the database is already closed, so the entity is detached from the database. What is the best practice (considering performance and memory usage too) to update the node?
Here is a code example:
FramedGraph<OrientGraph> graph = factory.getFramedGraph();
User user = graph.addVertex(null, User.class);
graph.shutdown();
Then, later, I want to update the node:
user.name = "Donald Duck";
user.... ?
Thank you,
Andrea
I found this approach, which seems quite efficient:
public User persistUser(User user) {
    FramedGraph<OrientGraph> graph = factory.getFramedGraph();
    user = graph.frame(user.asVertex(), User.class);
    factory.persist();
    graph.shutdown();
    return user;
}
So the framework automatically merges the entity back into the database. Then you have to persist.
