Combine Swinject and Realm

thanks for that framework. I really like the idea and I'm eager to use it! However, I'm currently trying to get this up and running with an app that uses Realm as well. I initially thought it might be a good idea to create a realmService which I inject into my models and which handles all of the Realm write stuff.
Sadly, I cannot make up my mind on how to do this properly. The Weather App example is great, but it doesn't cover any Realm models. Any hint to point me in the right direction? I tried via constructor and property injection but I just can't get it to work. I guess I'm missing something conceptual.
Thanks, I'm eager to learn from you :)
Cheers

I just forked the Weather example app and added Realm in there, using Swinject's DI mechanism. Registering the service/component pairs could look like this:
container.register(WeatherFetcher.self) { r in
    WeatherFetcher(networking: r.resolve(Networking.self)!,
                   realm: r.resolve(Realm.self)!)
}
container.register(Realm.Configuration.self) { _ in
    // not really necessary if you stick to the defaults everywhere
    return Realm.Configuration()
}
container.register(Realm.self) { r in
    try! Realm(configuration: r.resolve(Realm.Configuration.self)!)
}
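If you want all write access to go through a dedicated realmService, as you described, the same pattern applies. Here is a minimal, self-contained sketch using a hypothetical RealmService wrapper; the type name and its save method are illustrative and not part of the Weather example:
import RealmSwift
import Swinject

// Hypothetical wrapper that owns all Realm write access (illustrative only).
final class RealmService {
    private let realm: Realm

    init(realm: Realm) {
        self.realm = realm
    }

    func save(_ object: Object) throws {
        try realm.write {
            realm.add(object)
        }
    }
}

let container = Container()
container.register(Realm.Configuration.self) { _ in Realm.Configuration() }
container.register(Realm.self) { r in
    try! Realm(configuration: r.resolve(Realm.Configuration.self)!)
}
container.register(RealmService.self) { r in
    RealmService(realm: r.resolve(Realm.self)!)
}

// Constructor injection then boils down to resolving the service:
let service = container.resolve(RealmService.self)!
Keep in mind that Realm instances are confined to the thread they were created on, so resolve the service on the thread where you plan to use it.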

Related

How to get Axon event-identifier from the event-store

Just a short question here...
by using Axon, we know that AggregateLifecycle#apply(Object) does the event sourcing for us and, under the hood, persists our event into our event store.
With regard to that, how do we get the event identifier (not the aggregate identifier) once we call that particular apply method?
Thanks
Based on your other answer, let me suggest a way to follow.
The MessageIdentifier as used by AxonFramework (AF) is nothing more than a UUID generated for each Message you create.
Since you only need to reuse that info, you can pretty much get it from the Message while handling it. To make things easier for you, Axon provides a MessageIdentifierParameterResolver, meaning you can simply use it in any @MessageHandler of yours (of course, I am assuming you are using Spring as well).
Example:
@EventHandler
public void handle(Event eventToBeForwarded, @MessageIdentifier String messageIdentifier) {
    // forward the event to another broker using the given `messageIdentifier`
}
Hope that helps you and makes things clearer!

Register callback in Autofac and build container again in the callback

I have a dotnet core application.
My Startup.cs registers types/implementations in Autofac.
One of my registrations needs access to a service that has already been registered.
var containerBuilder = new ContainerBuilder();
containerBuilder.RegisterSettingsReaders(); // this makes available an ISettingsReader<string> that I can use to read my appsettings.json
containerBuilder.RegisterMyInfrastructureService(options =>
{
    options.Username = "foo"; // this should come from appsettings
});
containerBuilder.Populate(services);
var applicationContainer = containerBuilder.Build();
The dilemma is that by the time I have to call .RegisterMyInfrastructureService, I need the ISettingsReader<string> that was registered just before to be available (the Autofac container hasn't been built yet).
I was reading about registering a build callback to execute something after the Autofac container has been built. So I could do something like this:
builder.RegisterBuildCallback(c =>
{
    var stringReader = c.Resolve<ISettingsReader<string>>();
    var usernameValue = stringReader.GetValue("Username");
    // now I have my username "foo", but I want to continue registering things! Like the following:
    containerBuilder.RegisterMyInfrastructureService(options =>
    {
        options.Username = usernameValue;
    });
    // now what? build again?
});
but the problem is that I want to use the resolved service not to start something up, but to continue registering things that require the settings I am now able to provide.
Can I simply call builder.Build() again at the end of my callback so that the container is rebuilt without any issue? This seems a bit strange because the builder was already built (that's why the callback was executed).
What's the best way to deal with this dilemma in Autofac?
UPDATE 1: I read that things like builder.Update() are now obsolete because containers should be immutable, which confirms my suspicion that building a container, adding more registrations, and building it again is not good practice.
In other words, I can understand that a build callback should not be used to register additional things. But then the question remains: how do I deal with these issues?
This discussion issue explains a lot, including ways to work around having to update the container. I'll summarize here, but there is a lot of information in that issue that doesn't make sense to replicate in full.
Be familiar with all the ways you can register components and pass parameters. Don't forget about things like resolved parameters, modules that can dynamically put parameters in place, and so on.
Lambda registrations solve almost every one of these issues we've seen. If you need to register something that provides configuration and then, later, use that configuration as part of a different registration - lambdas will be huge.
Consider intermediate interfaces like creating an IUsernameProvider that is backed by ISettingsReader<string>. The IUsernameProvider could be the lambda (resolve some settings, read a particular one, etc.) and then the downstream components could take an IUsernameProvider directly.
These sorts of questions are hard to answer because there are a lot of ways to work around having to build/rebuild/re-rebuild the container if you take advantage of things like lambdas and parameters - there's no "best practice" because it always depends on your app and your needs.
Me, personally, I will usually start with the lambda approach.

Where did LoaderService go?

Upgrading AngleSharp from 0.9.6 to 0.9.9, this line of code no longer compiles:
return configuration.With(LoaderService(new[] { requester }));
It complains that LoaderService does not exist in the current context. So what happened to LoaderService? Is there a replacement? Does it still exist but just somewhere else?
Good question. Sorry for being late to the party, but even though you may have solved your problem, someone else may be having a hard time figuring it out.
LoaderService was essentially just a helper to create a loader. But having a service for every little thing that needs creating would be overkill and wouldn't scale well. Also, AngleSharp.Core would need to define all of these. So instead a generic mechanism was introduced, which allows registering such "creator services" via Func<IBrowsingContext, TService>.
However, to solve your piece of code I guess the following line would do the trick:
return configuration.WithDefaultLoader(requesters: requester);
This registers the default loader creator services (one for documents, one for resources inside documents) with the default options (options involve some middleware etc.).
Under the hood (besides some other things) the following is happening:
// just one example, config.Filter is created based on the passed in options
return configuration.With<IDocumentLoader>(ctx => new DocumentLoader(ctx, config.Filter));

Realm model migration strategy

I have run into a problem working with Realm migration blocks and the strategy for migrating realms.
Given an object MyObject with a number of properties:
In version 1 we have the property myProperty
In version 2 we change the property to myPropertyMk2
In version 3 we change the property to myPropertyMk3
Given following migration block:
private class func getMigrationBlock(realmPath: String) -> RLMMigrationBlock {
    return { migration, oldSchemaVersion in
        if (oldSchemaVersion == RLMNotVersioned) {
            NSLog("No database found when migrating.")
            return
        } else {
            NSLog("Migrating \(realmPath) from version \(oldSchemaVersion) to \(RealmMigrationHelper.CURRENT_DATABASE_VERSION)")
        }
        NSLog("Upgrading MyObject from version %d to %d", oldSchemaVersion, CURRENT_DATABASE_VERSION)
        if (oldSchemaVersion < 2) {
            migration.enumerateObjects(MyObject.className(), block: {
                oldObject, newObject in
                newObject["myPropertyMk2"] = oldObject["myProperty"]
            })
        }
        if (oldSchemaVersion < 3) {
            migration.enumerateObjects(MyObject.className(), block: {
                oldObject, newObject in
                newObject["myPropertyMk3"] = oldObject["myPropertyMk2"]
            })
        }
        NSLog("Migration complete.")
    }
}
When I was on version 2 of the DB this worked just fine (obviously without the oldSchemaVersion < 3 block), but when I introduced version 3 I started getting problems because it does not recognise newObject["myPropertyMk2"] in the oldSchemaVersion < 2 block. If I change it to newObject["myPropertyMk3"] it works just fine.
From reading the RLMMigration code this makes perfectly good sense, as we work with the old schema and the new schema, but based on the documentation on realm.io I do not think it makes sense; I would have expected the migration to be schema-less.
I have an idea about making a schema-less migration within the block by simply using a dictionary and then finally applying this dictionary to the newObject.
Are there any thoughts on a migration strategy for Realm that deals with this? It is mentioned on Realm's website, but only very briefly.
Thank you :)
thanks for your question and report of your issue.
From reading the RLMMigration code this makes perfectly good sense, as we work with the old schema and the new schema, but based on the documentation on realm.io I do not think it makes sense; I would have expected the migration to be schema-less.
As you correctly recognized from the code in RLMMigration, migrations are not schema-free. The migration closure which you provide should handle migrations from any version in the past to the current version. If a user didn't update your app in between and so skipped a version, there is no chance that Realm could be aware of your intermediate schema versions, as the schema is reflected at runtime. You're generally free to break backwards compatibility with existing old versions deliberately, but you would need to take care to reset the configuration to a defined state.
You're certainly right about the point that this could be better documented. I have created an internal ticket about that.
I have an idea about making a schema-less migration within the block by simply using a dictionary and then finally applying this dictionary to the newObject.
Are there any thoughts on a migration strategy for Realm that deals with this? It is mentioned on Realm's website, but only very briefly.
Depending on your schema and the amount of data you have, you can reorganize it object by object in memory via a dictionary and then apply it to newObject as you describe. The current API makes relatively few assumptions and allows an approach like this. But it wouldn't work equally well for everyone, e.g. if you have large lists of related objects.
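To make that concrete, here is a rough sketch of the dictionary approach inside the migration block from the question, reusing its property names and old-style RLMMigration API (the exact subscript and block signatures may vary between Realm versions):
migration.enumerateObjects(MyObject.className(), block: {
    oldObject, newObject in
    // Stage the old values in a plain dictionary keyed by the newest property name,
    // reading only properties that actually exist in the old schema.
    var values = [String: AnyObject]()
    if oldSchemaVersion < 2 {
        values["myPropertyMk3"] = oldObject["myProperty"]
    } else if oldSchemaVersion < 3 {
        values["myPropertyMk3"] = oldObject["myPropertyMk2"]
    }
    // Apply the staged values to the new schema in one place.
    for (key, value) in values {
        newObject[key] = value
    }
})
This keeps the per-version knowledge about old property names in one spot and only ever writes to the current schema, which is also why writing to newObject["myPropertyMk2"] fails once a third schema version exists.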

How do you like to define your module-wide variables in drupal 6?

I'm in my module file. I want to define some complex variables for use throughout the module. For simple things, I'm doing this:
function mymodule_init() {
  define('SOME_CONSTANT', 'foo bar');
}
But that won't work for more complex structures. Here are some ideas that I've thought of:
global:
function mymodule_init() {
  $GLOBALS['mymodule_var'] = array('foo' => 'bar');
}
variable_set:
function mymodule_init() {
  variable_set('mymodule_var', array('foo' => 'bar'));
}
property of a module class:
class MyModule {
  static $var = array('foo' => 'bar');
}
variable_set()/variable_get() seems like the most "Drupal" way, but I'm drawn toward the class setup. Are there any drawbacks to that? Any other approaches out there?
I haven't seen anyone storing static values that are arrays or objects.
For simple values, the Drupal way is to put a define() at the beginning of a module's .module file. This file is loaded when the module is activated, so that is enough. There's no point in putting it in the hook_init function.
variable_set() stores the value in the database, so don't run it over and over. Instead you could put it in your hook_install to define the values once. variable_set() is good to use if the value can be changed in an admin section, but it's not the best choice for a static value since you will need a query to fetch it.
I think all of those methods would work. I have never used it in this fashion, but I believe the context module (http://drupal.org/project/context) also has its own API for storing variables in a static cache. You may want to check out the documentation for the module.
It's always a good practice to avoid globals. So that makes your choice a bit easier.
If you err towards a class, you'll be writing code that is not consistent with the D6 standards. There are a lot of modules that do it, but I personally like to keep close to Drupal core so I can understand it better. And code that's written in different styles throughout the same application can have an adverse effect on productivity and maintenance.
variable_set() and define() are quite different. Use the former when you can expect that information to change (a variable). Use the latter for constants. Your requirements should make it clear which one to use.
Don't worry too much about hitting the database for variable_set/get. If your software is written well, it should hardly affect performance at all. Performance workarounds like that should only be implemented if your application has serious performance issues and you've tried everything else.
