I tried following the dotNetRDF documentation to store data in a Virtuoso server.
Here is what I do:
public void LoadGraph()
{
    //Create our Storage Provider - this example uses Virtuoso Universal Server
    VirtuosoManager virtuoso = new VirtuosoManager("creativeartefact.org", 1111, "DB", "user", "password");
    //Load the Graph into an ordinary graph instance first
    Graph g = new Graph();
    virtuoso.LoadGraph(g, new Uri("http://creativeartefact.org/"));
    //Then place the Graph into a wrapper
    StoreGraphPersistenceWrapper wrapper = new StoreGraphPersistenceWrapper(virtuoso, g);
    //Now make changes to this Graph as desired...
    //(note: the standard predicate is rdf:type, lowercase)
    g.Assert(g.CreateUriNode(new Uri("http://creativeartefact.org/testB/123")),
             g.CreateUriNode("rdf:Type"),
             g.CreateUriNode(new Uri("http://creativeartefact.org/ontology/Artist")));
    wrapper.Flush(); // mandatory, but doesn't help either
    //Remember to call Dispose() to ensure changes get persisted when you are done
    wrapper.Dispose();
}
But no data is saved and no exception is thrown. After inserting the triple, the triple count is raised by 1 as expected, but the triple doesn't make its way into the database: the count is back to the old value when I rerun the code. The user account has write permissions.
Any idea what I am missing?
Thanks in advance,
Frank
You're not fully reading the doc you're following.
The easiest way to save a Graph to a Triple Store is to use the SaveGraph() method of an IStorageProvider implementation, see Triple Store Integration for more details.
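For example, a minimal sketch based on the code in the question (same connection details; note that for many stores SaveGraph() replaces any existing graph with the same URI):
VirtuosoManager virtuoso = new VirtuosoManager("creativeartefact.org", 1111, "DB", "user", "password");

Graph g = new Graph();
virtuoso.LoadGraph(g, new Uri("http://creativeartefact.org/"));
g.BaseUri = new Uri("http://creativeartefact.org/"); // make sure the target graph URI is set

//...assert triples as desired...

//Persist the whole graph back to the store in one call
virtuoso.SaveGraph(g);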
Related
Firebase Realtime Database overrides my data at a location even when I use the .push() method. The little concrete knowledge I have about writing to the Firebase Realtime Database is that it can be done in a few ways. Two of the most prominent are:
1. the set() method, and
2. the push() method.
Long story short, push() is used to create a new key for the data to be written, and it adds the data under the node.
So far Firebase has been cooperating with me in my previous projects, but in this one I have no idea what is going on. I have tried different blends of push() and set() to achieve my goal, but no progress so far.
In the code below, what I want to achieve is two things: write chatUID, message, and time to a location only once, but write several message nodes (like '-MqBBXPzUup7czdG2xCI') under the same node "firebaseGeneratedId1". A better structure is below.
Help with the code would be appreciated. Thanks.
UPDATE
Here is my code
The writer's reference:
_listeningMsgRef = _msgDatabase
    .reference()
    .child('users')
    .child(userId)
    .child('chats')
    .child(chatUIDConcat);
When a user hits send, here is the function that is called:
void sendMessage() {
  _messageController.clear();
  var timeSent = DateTime.now().toString();
  //Send
  Map msgMap = {
    'message': msg,
    'sender': userId,
    'time': timeSent,
    'chatUID': chatUIDConcat
  };
  //String _key = _listeningMsgRef.push().key;
  // set() needs a value and overwrites everything already at this path
  _listeningMsgRef.child(chatUIDConcat).set(msgMap).whenComplete(() {
    SnackBar snackBar = const SnackBar(content: Text('Message sent'));
    ScaffoldMessenger.of(context).showSnackBar(snackBar);
    // set() returns a Future<void>, so its result is not a DatabaseReference
    _listeningMsgRef.child(chatUIDConcat).push().set(msgMap);
  });
}
The idea behind the sendMessage function is to write
chatUID:"L8pacdUOOohuTlifrNYC3JALQgh2+q5D38xPXVBTwmwb5Hq..."
message: "I'm coming"
newMessage: "true"
sender: "L8pacdUOOohuTlifrNYC3JALQgh2"
and, when that is complete, to push new message nodes under the user nodes.
EDIT:
I later figured out the issue. I wasn't able to achieve my goal because I was a bit tense while doing that project. The issue was that I wanted to write new data into the '-MqBBXPzUup7czdG2xCI' node without overwriting the old data in it.
The solution is straightforward: I just needed to write the data as new nodes under that node. Nothing much. Thanks, Frank van Puffelen, for your assistance.
Paths in the Firebase Realtime Database are automatically created when you write any data under them, and deleted when you remove the last data under them.
So you don't need to first create the node for the chat room. Instead, it gets auto-created when you write the first message into it with _listeningMsgRef.child(chatUIDConcat).push().set(msgMap).
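A minimal corrected sketch of sendMessage along those lines (assuming msg, userId, chatUIDConcat, and _listeningMsgRef are defined as in the question):
void sendMessage() {
  final msgMap = {
    'message': msg,
    'sender': userId,
    'time': DateTime.now().toString(),
    'chatUID': chatUIDConcat,
  };
  // push() mints a fresh child key, so every message becomes its own
  // node instead of overwriting the previous one.
  _listeningMsgRef.child(chatUIDConcat).push().set(msgMap).whenComplete(() {
    _messageController.clear();
    ScaffoldMessenger.of(context)
        .showSnackBar(const SnackBar(content: Text('Message sent')));
  });
}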
Should I use Akavache as a primary local database in my Xamarin.Forms application, or as a cache database on top of another SQLite database? I ask because I can't find any example of how to update, insert, or delete data in an Akavache object. For example,
CacheItems = await BlobCache.LocalMachine.GetOrFetchObject<List<Item>>("CacheItems",
    async () => await getdatafromAzure(recursive));
I am getting items from Azure and storing them on the local machine; these items are editable/deletable, and the user can add new items. How do I do that?
Anything saved to LocalMachine gets persisted physically to the device, so on app or device restart it'll still be there (if the user hasn't removed the app or cleared the data, that is).
As far as how to access/save data, there are lots of good samples here:
https://github.com/akavache/Akavache
InsertObject and GetObject are your basic access methods, and then there are lots of extension methods like GetOrFetch and GetAndFetch, which are very useful.
Here's a quick sample (which I haven't thoroughly tested) to show one way to access stuff. It'd probably be better to use some of the extension methods, but I figure an example like this is conceptually useful.
BlobCache.LocalMachine.GetObject<TObject>(someKey)
    .Catch((KeyNotFoundException ke) => Observable.Return<TObject>(null))
    .SelectMany(result =>
    {
        // Object doesn't exist in cache, so create a new one
        if (result == null)
            result = new TObject();
        // Apply whatever updates you want to do
        //result.SomeField = "bob";
        // This will replace or insert the data
        return BlobCache.LocalMachine.InsertObject(someKey, result);
    })
    .Subscribe();
It's really all pretty boring stuff :-p Just get an object and store an object. Under the hood, Akavache does a lot of really cool optimizations and synchronizations around that boring stuff, allowing it to stay boring for the rest of us.
In most of my cases, when I start up a VM I retrieve the object from the cache and then just store it as some property on the VM or inside some service class. Then, when any changes are made to the object, I just insert it into the cache:
BlobCache.LocalMachine.InsertObject(someKey, result).Subscribe();
At that point I know that if the app closes down, I'll have the latest version of that object right when the user starts up the app.
The examples I gave are more the full Rx way of accessing things... What you have in your original question works fine:
await BlobCache.LocalMachine.GetOrFetchObject<List<object>>("CacheItems",
    async () => await getdatafromAzure(recursive));
That will basically check the cache; if the object doesn't exist there, then it goes to Azure...
LocalMachine stores to the physical device, InMemory just stores to an internal dictionary that goes away once the app is unloaded from memory, and UserAccount works with NT roaming accounts.
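For deletes, Akavache exposes Invalidate/InvalidateObject. A minimal sketch along the lines of the question's Item list (the key and variable names are just examples):
// Drop the cached list entirely (use InvalidateObject rather than
// Invalidate for anything stored via InsertObject)
await BlobCache.LocalMachine.InvalidateObject<List<Item>>("CacheItems");

// Or, after the user edits/adds/removes items, simply overwrite the list
await BlobCache.LocalMachine.InsertObject("CacheItems", updatedItems);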
I'm trying to implement a simple DDD/CQRS architecture, without event sourcing for now.
Currently I need to write some code for adding a notification to a document entity (a document can have multiple notifications).
I've already created a NotificationAddCommand, an ICommandService, and an IRepository.
Before inserting a new notification through IRepository, I have to query the current user_id from the DB using the NotificationAddCommand.User_name property.
I'm not sure how to do it right, because I could either:
1. use IQuery from the read flow, or
2. pass user_name to the domain entity and resolve user_id in the repository.
Code:
public class DocumentsCommandService : ICommandService<NotificationAddCommand>
{
    private readonly IRepository<Notification, long> _notificationsRepository;

    public DocumentsCommandService(
        IRepository<Notification, long> notifsRepo)
    {
        _notificationsRepository = notifsRepo;
    }

    public void Handle(NotificationAddCommand command)
    {
        // command.user_id = Resolve(command.user_name) ??
        // command.source_secret_id = Resolve(command.source_id, command.source_type) ??
        foreach (var receiverId in command.Receivers)
        {
            var notificationEntity = _notificationsRepository.Get(0);
            notificationEntity.TargetId = receiverId;
            notificationEntity.Body = command.Text;
            _notificationsRepository.Add(notificationEntity);
        }
    }
}
What if I need more complex logic before inserting? Is it ok to use IQuery or should I create additional services?
The idea of reusing your IQuery somewhat defeats the purpose of CQRS, in the sense that your read side is supposed to be optimized for pulling data for display/query purposes - meaning that it can be denormalized, distributed, etc. in any way you deem necessary without being restricted by, or having implications for, the command side. A key example: the read side might not be immediately consistent, while your command side obviously needs to be for integrity/validity purposes.
With that in mind, you should look to implement a contract for your write side that will resolve the necessary information for you. Driving from the consumer, that might look like this:
public DocumentsCommandService(IRepository<Notification, long> notifsRepo,
    IUserIdResolver userIdResolver)

public interface IUserIdResolver
{
    string ByName(string username);
}
With IUserIdResolver implemented as appropriate.
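For illustration, here is a sketch of how the handler might consume that contract; it reuses only the types from the question, and the resolver's backing store is left up to you:
public class DocumentsCommandService : ICommandService<NotificationAddCommand>
{
    private readonly IRepository<Notification, long> _notificationsRepository;
    private readonly IUserIdResolver _userIdResolver;

    public DocumentsCommandService(
        IRepository<Notification, long> notifsRepo,
        IUserIdResolver userIdResolver)
    {
        _notificationsRepository = notifsRepo;
        _userIdResolver = userIdResolver;
    }

    public void Handle(NotificationAddCommand command)
    {
        // Resolve through the write-side contract rather than the read-side IQuery
        var userId = _userIdResolver.ByName(command.User_name);
        // ...create and add notifications as in the original Handle...
    }
}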
Of course, if both this and the query side use the same low-level data access implementation (e.g. an immediately-consistent repository), that's fine. What's important is that your architecture is such that if you ever need to swap out where your read side gets its data (for example, to facilitate a slow offline process), your read and write sides are sufficiently separated that you can do so without having to untangle the reads from the writes.
Ultimately the most important thing is to know why you are making the architectural decisions you're making in your scenario - then you will find it much easier to make these sorts of decisions one way or another.
In a project I'm working on I have similar issues. I see three options to solve this problem:
1) What I did was make a UserCommandRepository that has a query option, and inject that repository into your service. Since the few queries I needed were so simplistic (just returning single values), it seemed like a fine tradeoff in my case.
2) Another way of handling it is to force the caller to raise the command with the user_id already resolved, letting them do the querying.
3) A third option is to ask yourself why you need a user_id. If it's to make some relations when querying the data, you could also handle this when querying the data (or when propagating your write DB to your read DB).
I'm researching how to use Realm DB for my new project and I've hit some issues. Please share your experience from working with Realm DB. Sorry about my long question list.
First, please refer to my sample code.
Dog class
Dog.h
@interface Dog : RLMObject
@property NSString *name;
@property NSInteger age;
@end
RLM_ARRAY_TYPE(Dog) // define RLMArray<Dog>
And Person class
Person.h
@interface Person : RLMObject
@property NSString *name;
@property NSInteger age;
// to-many relationship
@property RLMArray<Dog *><Dog> *dogs;
@end
RLM_ARRAY_TYPE(Person) // define RLMArray<Person>
Then I create data for Person and Dog (1,000,000 records); one person has one-to-many dogs. I got stuck in some cases:
How do I get the people who have a dog named "Rex"? I researched this but found no guide for Objective-C.
Deleting objects seems slow. I tried to delete every person named "Ethan"; performance is slow and the app crashes when it has executed half the list. I guess I used an incorrect way to delete objects.
RLMResults *people = [Person objectsWhere:@"name == 'Ethan'"];
// Get the default Realm
RLMRealm *realm = [RLMRealm defaultRealm];
[realm beginWriteTransaction];
// Note: RLMResults auto-update as objects are deleted, so the
// collection shrinks underneath this counting loop.
for (int i = 0; i < people.count; i++) {
    Person *aPerson = [people objectAtIndex:i];
    [realm deleteObject:aPerson];
}
[realm commitWriteTransaction];
The result: there are >52,000 records with the name "Ethan", and the app only deletes half of them (26,000).
I don't know how to delete records matching a condition in Realm. For my question #2, I imagine code like the following:
[Person deleteWhereObject:@"name = 'Ethan'"];
It is not yet clear how to rename, delete, or add a new column to the DB after it is created (apart from the simplest way, which is to delete the DB and recreate it).
The tool for browsing the data file created on the desktop (Realm Browser) doesn't provide much flexibility in querying data. It only allows browsing through the data, not querying with a condition. Please show me how to query data with this tool if I missed something.
For troubleshooting, I may have an existing DB from a client app that I want to import into my project to investigate the client's bugs. How can I do that with Realm DB?
After inserting 1,000,000 records into the Person and Dog tables, the DB size is 52.8 MB. But the DB size increases to 92.3 MB after I delete all the data:
// delete all objects
[realm beginWriteTransaction];
[realm deleteAllObjects];
[realm commitWriteTransaction];
Then I insert data again and the file size continues to increase. I don't know what's wrong with my steps.
Hope to get your support soon!
RLMResults *people = [Person objectsWhere:@"ANY dogs.name == 'Rex'"];
Use deleteObjects: with the retrieved RLMResults instead of deleting every single object yourself. If you enumerate, I'd recommend using NSFastEnumeration with for (… in …). Note that RLMResults auto-update, so if you make concurrent changes you can run into surprises.
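A minimal sketch of that, reusing the query from the question:
RLMResults *people = [Person objectsWhere:@"name == 'Ethan'"];
RLMRealm *realm = [RLMRealm defaultRealm];
[realm beginWriteTransaction];
// Deletes every matched object in one call, avoiding the
// index shifting that made the counting loop stop halfway.
[realm deleteObjects:people];
[realm commitWriteTransaction];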
See my answer for question 2.
Change your schema, bump the schemaVersion of your RLMRealmConfiguration, and provide a migration block if you rename properties or change their type.
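For instance, a rough sketch of the configuration side (the version number is hypothetical):
RLMRealmConfiguration *config = [RLMRealmConfiguration defaultConfiguration];
config.schemaVersion = 2; // bump this whenever the schema changes
config.migrationBlock = ^(RLMMigration *migration, uint64_t oldSchemaVersion) {
    if (oldSchemaVersion < 2) {
        // Added/removed properties are handled automatically;
        // renamed properties and type changes are mapped in here.
    }
};
[RLMRealmConfiguration setDefaultConfiguration:config];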
This isn't possible yet, but still in the making. This is tracked by issue #28 in the realm-browser-osx repo.
Put the realm file in the app bundle and copy it at runtime into the user data directory (e.g. RLMRealm.defaultConfiguration.path), if you want to be able to write to the file.
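Roughly, assuming a bundled file named client.realm (the name is hypothetical):
NSString *bundledPath = [[NSBundle mainBundle] pathForResource:@"client" ofType:@"realm"];
NSString *defaultPath = RLMRealm.defaultConfiguration.path;
// Copy once so the app works on a writable copy of the client's file
if (![[NSFileManager defaultManager] fileExistsAtPath:defaultPath]) {
    [[NSFileManager defaultManager] copyItemAtPath:bundledPath
                                            toPath:defaultPath
                                             error:nil];
}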
The file size blows up because no compaction is happening. You can enforce compaction by writing a compacted copy with writeCopyToPath:. You could do this e.g. at app start, because otherwise it can be non-trivial to make sure that all RLMRealm instances have been torn down beforehand.
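A rough sketch (the target path is made up; in practice, only swap the files once every RLMRealm instance has been released):
NSError *error = nil;
NSString *compactedPath = [RLMRealm.defaultConfiguration.path
                           stringByAppendingString:@".compact"];
// Writes a compacted copy of the default Realm to compactedPath;
// afterwards, replace the original file with it using NSFileManager.
[[RLMRealm defaultRealm] writeCopyToPath:compactedPath error:&error];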
I am trying to make a mobile app that inserts information into a database. I have it pulling information, and I have created my services, but when I use the standard create service that Flash Builder 4.6 automatically makes, it just does not insert anything at all into the database.
I have tried everything, and I now have the following code.
I first create a Coordinatelist variable to hold the values for the service call:
....
protected var Coordinatelist2:Coordinatelist = new Coordinatelist();
I then created a function to fill in the information. The variables SesID, Lat and Long are declared when my button is pressed.
....
protected function createCoordinatelist(item:Coordinatelist):void
{
    Coordinatelist2.SessionID = SesID;
    Coordinatelist2.Latitude = Lat;
    Coordinatelist2.Longitude = Long;
    createCoordinatelistResult.token = coordinatelistService.createCoordinatelist(Coordinatelist2);
}
After this, I add the following line of code to the end of my button function.
.....
createCoordinatelist(Coordinatelist2);
Now, as far as I can tell, this should write the SesID, Lat, and Long values to the database using the created service token, but when I do this, nothing is entered into the database at all.
Am I doing something wrong here?
Sorry for the confusion, but I came right.
What I found was that I had not committed my changes once I created the token. I simply used createCoordinatelistResult.token = coordinatelistService.commit(); and it worked perfectly.
Thanks for the feedback.