I started Meteor a few months ago.
I would like to know if using cursor.observeChanges for business objects is a good idea.
I want to separate operations from views so I can reuse the same operations in many views/events.
Someone told me we should not separate operations on Mongo from the view.
So my question is: is it a good idea to use Business Objects with Meteor?
Thanks for reading.
cursor.observeChanges is essentially what you get behind the scenes when you do normal find() queries and bind them to template helpers, since the helper's context is reactive.
In the meteor world, the traditional model/view/controller paradigm is shifted towards a reactive data-on-the-wire concept including features like latency compensation.
What you refer to as a business object is basically a representation of your business data that is strongly typed, has a type of its own, is atomic, and has the single responsibility of representing that data.
You can achieve that kind of separation of concerns in any language/framework, including meteor. That only depends on how you lay out, structure and abstract your code.
What Meteor brings into the equation is the toolset to build an interface to your data with modern UX features that are otherwise very hard or expensive to get.
The only concern for business-class applications could be the fact that Meteor currently employs MongoDB by default. MongoDB has its own ongoing discussions around business applications and whether they need transaction support, ad-hoc aggregation, foreign-key relationships, and so on. But that is another topic.
Hi, I am building a small app to get used to Meteor (and Mongo). Something that is bothering me is the data modelling aspect, specifically what the best way is to model a many-to-many relationship. I have read in the Mongo docs that a doc should not be embedded in another doc if you expect it to grow while the original doc remains fairly static.
In my test app, students can register for courses. From the Mongo perspective it makes sense to include the students as an embedded doc in the course, as each course will have a limited number of students, as opposed to the other way round, where, over time, a student could theoretically join an unlimited number of courses.
Then there is the Meteor aspect. I read that a lot of Meteor's features are aimed at separate collections: DDP works at the document level, so any change in the students array would cause the entire course doc to be resent to every browser; the {{#each}} Spacebars helper works with Mongo cursors but not with plain arrays; and so on.
Has anyone dealt with a similar situation and could they explain what approach they took and any drawbacks they had to deal with etc? Thanks.
See this article: https://www.discovermeteor.com/blog/reactive-joins-in-meteor/
And test how well your candidate solutions perform with Kadira: https://kadira.io/
Better yet, use the guide:
http://guide.meteor.com/data-loading.html#publishing-relations
The Meteor team tames (or hides!) the JavaScript monster to an amazing extent. By using their conventions, you get a ton of much-used functionality for free, out of the box: things that are usually reinvented over and over again, such as accounts, OAuth, live data across clients, and a standard live-data protocol.
But very soon... you need features not in the box. Wow... look at all the choices. Wait a minute, this is the same monster you were fighting before Meteor!
So use the official Meteor Guide. They recommend the wisest ways to extend functionality of your app when you make these choices.
Since they know how they have "hidden the monster", they know how to keep avoiding it when you extend.
I'm having trouble understanding the concepts of DDD. I have an ASP.NET project with this structure:
ASP.NET MVC4 project: xxx.UI.Web
Class Library projects: xxx.Application, xxx.Domain, xxx.Infra.EF
I'm trying to keep these relations:
xxx.UI.Web references only xxx.Domain and xxx.Application.
xxx.Domain references nothing.
xxx.Application references xxx.Domain and xxx.Infra.EF.
xxx.Infra.EF references xxx.Domain.
But now I'm having many problems keeping to these concepts with Entity Framework. I have created the entity repositories in xxx.Infra.EF and a generic repository (with interfaces and so on) in xxx.Application.
The problems begin when I need to pass a particular Entity Framework context to my repositories, because I use the repositories in xxx.UI.Web and I can't instantiate a new context there, since that would break my project structure (the context comes from xxx.Infra.EF).
My idea is to create helper methods that handle this kind of operation for xxx.UI.Web, but I would rather not create these methods in xxx.Application (it looks a little strange to create many methods that have little relation to my business logic).
So I was reading a little about Domain-Driven Design (DDD) and learned about the Service layer, which seems to be the layer that was created to solve problems like this, or is it not?
My idea is to create a new class library project called xxx.Service and make this project reference xxx.Domain and xxx.Infra.EF. Is that right? I know I could look for other solutions to this particular Entity context problem, but I suspect I'll run into more problems like it in the future, so I am trying to find a general solution. I should study this much more, but I think I have identified the solution to my problem.
I suggest you study the concept of layered architecture further.
You are correct that the Domain should not reference other layers, such as the Presentation Layer and Infrastructure Layer.
The most common style of layered architecture is the "Onion Architecture". See the link below.
http://jeffreypalermo.com/blog/the-onion-architecture-part-1/
The Service Layer
You are also correct that a new layer (the Service Layer) will solve your problem. You see, the problem arises because the UI/Presentation Layer is talking directly to the repositories (Infrastructure).
In our approach, the Presentation Layer does not communicate directly with the Infrastructure and Domain. We have a Service layer that comes between the Presentation and the Domain.
Our architecture borrows heavily from the Onion Architecture. We don't have domain services; the Application Core is our cross-cutting layer.
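To make that concrete, here is a minimal sketch under assumed names (Order, IOrderRepository, OrderService are invented for illustration, not taken from the question): the repository interface lives in xxx.Domain, its Entity Framework implementation in xxx.Infra.EF, and the Service layer is the only code that composes them, so the controller never touches the EF context.

```csharp
using System.Data.Entity;   // xxx.Infra.EF only
using System.Web.Mvc;       // xxx.UI.Web only

// --- xxx.Domain: no references to other layers ---
public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public interface IOrderRepository
{
    Order GetById(int id);
    void Add(Order order);
}

// --- xxx.Infra.EF: references xxx.Domain only; the EF context never leaves this project ---
public class EfOrderRepository : IOrderRepository
{
    private readonly DbContext _context;

    public EfOrderRepository(DbContext context) { _context = context; }

    public Order GetById(int id) { return _context.Set<Order>().Find(id); }

    public void Add(Order order)
    {
        _context.Set<Order>().Add(order);
        _context.SaveChanges();
    }
}

// --- xxx.Service: depends on the IOrderRepository abstraction, not on EF ---
public class OrderService
{
    private readonly IOrderRepository _orders;

    public OrderService(IOrderRepository orders) { _orders = orders; }

    public Order GetOrder(int id) { return _orders.GetById(id); }

    public Order PlaceOrder(decimal total)
    {
        var order = new Order { Total = total };
        _orders.Add(order);
        return order;
    }
}

// --- xxx.UI.Web: the controller only sees the service ---
public class OrdersController : Controller
{
    private readonly OrderService _service;

    public OrdersController(OrderService service) { _service = service; }

    public ActionResult Details(int id) { return View(_service.GetOrder(id)); }
}
```

At application start-up you would register EfOrderRepository for IOrderRepository with whatever IoC container you prefer, so only the composition root knows that Entity Framework exists.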
People often misunderstand the point of a multi-tiered application. The goal is not necessarily to remove dependencies, but rather to encapsulate and modularize code. Your MVC frontend shouldn't need to be concerned with your entities and how they're queried, added, updated, etc., but that doesn't mean it doesn't need access to them. Long and short: don't focus on project references, but rather on factoring code that is not in a project's "domain" out into a more appropriate one.
To that end, it's not really possible to say, based on the information you provided, exactly what you should do. The question is opinionated to begin with and will very likely end up being closed as a result. Ultimately, you have to decide what's best for your project. Do what makes sense rather than blindly following some pattern; after all, patterns are supposed to be a codification of common-sense approaches to common problems anyway.
I'm thinking through data access for an ASP.NET application. Coming from a company that uses a lot of Windows applications with client datasets, there is a natural tendency towards a DataSet approach for dealing with data.
I'm more keen on a Business Object approach and I don't like the idea of caching a DataSet in the session then applying an update.
Does anyone have any experience / help to pass on about the pros and cons of both approaches?
You are smart to be thinking of designing a Data Layer in your app. In an ASP.NET application this will help you standardize and pretty dramatically simplify your data access. You will need to learn how to create and use ObjectDataSources but this is quite straightforward.
The other advantage of a data access layer (built using a separate project/DLL) is that it makes Unit testing much simpler. I'd also encourage you to build a Business Layer to do much of the processing of data (the business layer, for example, would be responsible for pulling ObjectDataSources from the DAL to hand to the UI code). Not only does this let you encapsulate your business logic, it improves the testability of the code as well.
You do not want to be caching DataSets (or DAL objects, for that matter) in the session! You will build a Web app so that record modifications work through a Unique ID (or other primary key spec) and feed changes directly to the DAL as they are made. If you were to cache everything you would dramatically reduce the scalability of your app.
Update: Others on this thread are promoting the idea of using ORMs. I would be careful about adopting a full-blown ORM for reasons that I have previously outlined here and here. I do agree, though, that it would be wise to avoid DataSets. In my own work, I make extensive use of DataReaders to fill my ObjectDataSources (which is trivial due to the design of my DAL) and find it to be very efficient.
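As a rough illustration of that approach (the Customer class, table, and connection string are invented for the example), a DAL method can fill plain business objects from a SqlDataReader, and a method like this is exactly what an ObjectDataSource's SelectMethod can point at:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CustomerDal
{
    private const string ConnectionString = "...";   // pulled from config in a real app

    // An ObjectDataSource can bind its SelectMethod to this.
    public static List<Customer> GetCustomers()
    {
        var customers = new List<Customer>();
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand("SELECT Id, Name FROM Customers", connection))
        {
            connection.Open();
            using (IDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    customers.Add(new Customer
                    {
                        Id = reader.GetInt32(0),
                        Name = reader.GetString(1)
                    });
                }
            }
        }
        return customers;
    }
}
```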
DataSets can be incredibly inefficient compared even to other ADO.NET objects like DataReaders. I would suggest going the BO/ORM route based on what you are saying.
If you're going to follow Microsoft's direction, then the trend is definitely towards LINQ (ORM) rather than DataSets. When DataSets came into being (ASP.NET 1.0), LINQ wasn't even possible. With LINQ you get type safety and built-in operators to create, update, and delete against the database.
Microsoft has even tried to make the transition easier through LINQ to DataSet.
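For instance (table and column names invented for illustration), LINQ to DataSet lets you query an existing DataTable with the same strongly-typed operators you would use elsewhere, via the System.Data.DataSetExtensions assembly:

```csharp
using System;
using System.Data;
using System.Linq;   // plus a reference to System.Data.DataSetExtensions

class Program
{
    static void Main()
    {
        var orders = new DataTable("Orders");
        orders.Columns.Add("Id", typeof(int));
        orders.Columns.Add("Total", typeof(decimal));
        orders.Rows.Add(1, 120m);
        orders.Rows.Add(2, 45m);

        // Query the DataTable with LINQ instead of DataTable.Select filter strings.
        var bigOrders = orders.AsEnumerable()
                              .Where(r => r.Field<decimal>("Total") > 100m)
                              .Select(r => r.Field<int>("Id"));

        foreach (var id in bigOrders)
            Console.WriteLine(id);
    }
}
```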
We're about to do a big update to an existing ASP.NET app that used DataSet objects heavily; although I am not looking forward to the pain, I am going to insist on going down the BO route. Just the thought of trying to make DataSets work now causes me to break out in a sweat.
I think we are going to go down the LINQ route and use lightweight entity objects.
The company where I work also makes heavy use of DataSets, even though there is a business layer; the BL mainly loads DataSets from the DB.
I personally dislike this approach. There is also a practice of directly modifying the DataSets after load/before save to meet immediate needs here and there. To me this really violates the idea of business objects, but that's how it is done.
ORM frameworks can really save you a great deal of time, especially in enterprise applications with lots of views with similar buttons and operations.
But it's also easy to lose control, and from that point it will slowly turn into a mess.
Both options are good when used in the right cases. Just don't mix them: decide to do it one way and follow it.
Mark Brittingham's answer is accurate for two-tier applications. But what if I want to use a service tier? DataSets are serializable. Typed DataSets save time over hand-coding your own objects. Typed DataSets are extensible. LINQ to Entities has performance issues, and LINQ to SQL is now dead. LINQ to DataSet will always be an option.
I will use typed DataSets and a multi-layered architecture to save time and organize code. I've tried hand-coded BOs, and the extra development and maintenance time is not worth it.
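As a sketch of what that looks like in a layered app (NorthwindDataSet and CustomersTableAdapter stand in for whatever the dataset designer would generate in your project; they are placeholders, not hand-written classes):

```csharp
// Hypothetical designer-generated types: NorthwindDataSet and its CustomersTableAdapter.
// In a service-tier scenario the typed DataSet/DataTable can be returned directly,
// because DataSets are serializable.
public class CustomerService
{
    public NorthwindDataSet.CustomersDataTable GetCustomers()
    {
        var table = new NorthwindDataSet.CustomersDataTable();
        using (var adapter = new NorthwindDataSetTableAdapters.CustomersTableAdapter())
        {
            adapter.Fill(table);   // typed Fill, no column-name strings
        }
        return table;
    }
}
```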
Getting data from a database table to an object in code has always seemed like mundane code. There are two ways I have found to do it:
1. have a code generator that reads a database table and creates the class and controller to map the data fields to the class fields, or
2. use reflection to take the database field and find it on the class.
The problems I see with the above two methods are:
Method 1 seems to me like I'm missing something, because I have to create a controller for every table.
Method 2 seems to be too labor-intensive once you get into heavy data access code.
Is there a third route that I should try to get data from a database onto my objects?
You normally use O/R (object-relational) mappers in such situations. A good framework providing O/R functionality is Hibernate. Does this answer your question?
I think the answer to this depends on the available technologies for the language you are going to use.
I for one am very successful with the use of an ORM (NHibernate) so naturally I may recommend option one.
There are other options that you may wish to take though:
If you are using .NET, you may opt to use attributes on your class properties to serve either as a mapping within the class or as data that can be reflected over.
If you are using .NET, Fluent NHibernate will make it quite easy to make type-safe mappings within your code.
You can use generics so that you will not need to make a controller for every table, although I admit it is likely that you will do so anyway. However, the generic base can contain most of the general CRUD methods that are common to all tables, so you will only need to code the specific quirks (a rough sketch follows below).
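Here is the rough sketch promised above of the generics idea, written against NHibernate's ISession (entity and repository names are illustrative only):

```csharp
using NHibernate;

// A generic base repository: the CRUD that every table needs lives here once,
// and per-entity repositories only add their specific quirks.
public class Repository<T> where T : class
{
    protected readonly ISession Session;

    public Repository(ISession session) { Session = session; }

    public T GetById(object id)  { return Session.Get<T>(id); }
    public void Add(T entity)    { Session.Save(entity); }
    public void Update(T entity) { Session.Update(entity); }
    public void Delete(T entity) { Session.Delete(entity); }
}

// Only entity-specific behaviour needs its own class.
public class StudentRepository : Repository<Student>
{
    public StudentRepository(ISession session) : base(session) { }
    // e.g. queries that only make sense for students go here
}

public class Student
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}
```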
I use reflection to map data back and forth and it works well even under heavy data access. The "third route" is to do everything by hand, which may be faster to run but really slow to write.
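For what it's worth, the reflection approach can stay quite small. A minimal sketch (no caching and no type-conversion edge cases) that maps IDataReader rows onto any class whose writable property names match the column names:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

public static class ReflectionMapper
{
    // Maps each row to a T by matching column names to writable property names.
    public static List<T> MapAll<T>(IDataReader reader) where T : new()
    {
        var results = new List<T>();
        var properties = typeof(T).GetProperties();

        while (reader.Read())
        {
            var item = new T();
            foreach (var property in properties)
            {
                if (!property.CanWrite) continue;

                int ordinal;
                try { ordinal = reader.GetOrdinal(property.Name); }
                catch (IndexOutOfRangeException) { continue; }   // no matching column

                if (!reader.IsDBNull(ordinal))
                    property.SetValue(item, reader.GetValue(ordinal), null);
            }
            results.Add(item);
        }
        return results;
    }
}
```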
I agree with lewap, an ORM (object-relational mapper) really helps in these situations. You may also want to consider the Active Record pattern (discussed in Fowler's Patterns of Enterprise Application Architecture book). It can really speed up creation of the DAL in simple apps.
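In the Active Record style, the persistence code lives on the entity itself. A bare-bones sketch (table, SQL, and connection string are simplified placeholders):

```csharp
using System.Data.SqlClient;

// Active Record style: the entity knows how to load and save itself.
public class Product
{
    private const string ConnectionString = "...";   // placeholder

    public int Id { get; set; }
    public string Name { get; set; }

    public static Product Find(int id)
    {
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            "SELECT Id, Name FROM Products WHERE Id = @id", connection))
        {
            command.Parameters.AddWithValue("@id", id);
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                if (!reader.Read()) return null;
                return new Product { Id = reader.GetInt32(0), Name = reader.GetString(1) };
            }
        }
    }

    public void Save()
    {
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            "UPDATE Products SET Name = @name WHERE Id = @id", connection))
        {
            command.Parameters.AddWithValue("@name", Name);
            command.Parameters.AddWithValue("@id", Id);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```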
I'm about to start a new .net web project.
Previously, for my n-layer web apps I've used the "Microsoft Data Access Application Block" (sqlhelper.cs) for the data access, then an interface class to talk to the object classes. I'm aware this technique is a bit dated and was looking to use something a little more current.
I've looked into LINQ to SQL for data access but was put off by its lack of support for many-to-many relationships.
Entity Framework is a whole different approach that appears to have too large a learning curve.
Would there be anything wrong with using the sqlhelper.cs class to handle my data access?
Not at all. It supports the creation of multi-tier layers, which separates your data access from your logic. I usually have business and data classes in separate folders and include the SqlHelper class alongside my data DALC class.
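For context, a typical DALC method built on the classic sqlhelper.cs looks something like this (the table, stored procedure, and parameter names are invented; the SqlHelper overloads shown are the usual static ones from the Data Access Application Block):

```csharp
using System.Data;
using System.Data.SqlClient;

// Assumes sqlhelper.cs from the Data Access Application Block is included in the project.
public class CustomerDalc
{
    private readonly string _connectionString;

    public CustomerDalc(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Read via a stored procedure through the block's SqlHelper.
    public DataSet GetCustomersByCity(string city)
    {
        return SqlHelper.ExecuteDataset(
            _connectionString,
            CommandType.StoredProcedure,
            "usp_GetCustomersByCity",
            new SqlParameter("@City", city));
    }

    // Write through the same helper.
    public void RenameCustomer(int id, string newName)
    {
        SqlHelper.ExecuteNonQuery(
            _connectionString,
            CommandType.StoredProcedure,
            "usp_RenameCustomer",
            new SqlParameter("@Id", id),
            new SqlParameter("@NewName", newName));
    }
}
```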
I'm looking forward to the move towards LINQ and the use of generics. That's my next step, and I think the use of SqlHelper promotes good coding practice in the meantime.
I first started using it when I "borrowed" it from the Enterprise Library, which was huge. Ditto for Entity Framework, which I have yet to come to grips with in the workplace, but all in good time :-)
You can find a simple, good example of a SQL helper class at http://followprogrammers.blogspot.com/
For the last 8 months I've been using LINQ, and it works great for all the little jobs. The strong typing and drag-and-drop development make it fantastically easy and super quick.
Before that, and still when LINQ doesn't seem right, I use SqlHelper. It cuts out all the donkey work of ADO.NET. I don't see any problem using it.