Experiences with a single-instance multi-tenant web application in Seam?

Any experiences with Seam in a one-instance multi-tenant setup? Is Seam suited for that setup? How did you realise it? What were the costs involved?
Our situation: a Seam 2.1 SaaS web app (POJO, no EJB). The available development budget forced us towards a simplistic one-instance-per-tenant design. The application is not in production yet but is nearly finished.
I expect our customer might reconsider a one-instance multi-tenant setup if it lowers the projected hosting costs.

We've developed a multi-tenant SaaS application with Seam. I don't think that Seam has any advantages or disadvantages for this sort of thing.
The only piece of functionality that is possibly useful is Hibernate filters (e.g. have a company ID on every table and set a Hibernate filter for it). This means every query will have that ID appended automatically.

I have a class called User, and it has as its members all of that user's data. So there's a one-to-many relationship from User to Task, for instance. Then my query for all of a user's tasks is simply: select task from Task task, User user where user.id = #{user.id} and task member of user.taskList. I could also have used filters, as another answer mentioned. However, since the #{user} object is created on login, it's available via Seam's parsing of the EL string. Quite handy.
So, while there is nothing in Seam to support multi-tenancy, it's fairly easy to do.

Add users for ASP.NET Core from internal website

Sorry, no code here, because I am looking for a better idea, or confirmation that I am on the right track.
I have two websites; let's call them A and B.
A is a website exposed to the internet, and only users with a valid account can access it.
B is an internal (intranet) website using Windows authentication against Active Directory. I want application B (intranet) to create users for application A.
Application A is using the built-in ASP.NET JWT token authentication.
My idea is to expose an API on the internet-facing website (A) and let (B) access this API. I can use CORS to make sure only (B) has access to the endpoint, but I am not sure whether that is good enough protection. A third-party company will perform security penetration tests, so this might fail the security test.
Or
I could use Entity Framework to update the AspNetUsers table manually. No idea whether this is feasible or the right way of doing things.
Any other solution?
In my opinion, don't expose your internal operations through external solutions like public APIs.
Just share the database so it is accessible to B. That way, server administration is the only security concern and nobody outside knows how you work. In addition, it doesn't matter how each application implements user authentication (Windows Authentication or JWT); each keeps its own independent infrastructure.
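A minimal sketch of that approach, assuming application B points ASP.NET Core Identity at the same database application A uses, so passwords get hashed by the same Identity machinery rather than written to AspNetUsers by hand. The context, service, and connection-string names are illustrative:

    // Program.cs of application B (intranet): a sketch, not A's actual code.
    using Microsoft.AspNetCore.Identity;
    using Microsoft.AspNetCore.Identity.EntityFrameworkCore;
    using Microsoft.EntityFrameworkCore;

    var builder = WebApplication.CreateBuilder(args);

    // Point Identity at the database shared with application A.
    builder.Services.AddDbContext<SharedIdentityContext>(options =>
        options.UseSqlServer(builder.Configuration.GetConnectionString("SharedIdentityDb")));
    builder.Services.AddIdentityCore<IdentityUser>()
        .AddEntityFrameworkStores<SharedIdentityContext>();
    builder.Services.AddScoped<UserProvisioningService>();

    var app = builder.Build();
    app.Run();

    // Same Identity schema that application A uses.
    public class SharedIdentityContext : IdentityDbContext<IdentityUser>
    {
        public SharedIdentityContext(DbContextOptions<SharedIdentityContext> options) : base(options) { }
    }

    // Called from B's admin pages to create an account usable by A.
    public class UserProvisioningService
    {
        private readonly UserManager<IdentityUser> _users;
        public UserProvisioningService(UserManager<IdentityUser> users) => _users = users;

        public Task<IdentityResult> CreateUserForAAsync(string email, string password)
        {
            var user = new IdentityUser { UserName = email, Email = email };
            return _users.CreateAsync(user, password); // hashes the password the same way A expects
        }
    }

If A uses a custom user type or password hasher, B has to mirror that configuration exactly; that duplication is the main maintenance cost of the shared-database route.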
There are multiple solutions to this one problem. In the end it really depends on your specific criteria.
You could go with:
B (intranet) website reaching into the database and creating users as needed.
A (internet) website exposing an API with the necessary endpoint to create users.
A (internet) website running a data migration every now and then to insert users.
But they all come with their ups and downs; I'll try to break them down for you.
API solution
Ups:
Single responsibility: you have only one piece of code touching this database, which makes it easier to mitigate side effects.
It is "future proof": you could easily have more services using this API.
Downs:
Attack surface increased: the API is public, so it is exposed to third parties trying to play with it.
The API has to be maintained as the database model changes (one more piece to maintain).
Not the fastest solution to implement.
Database direct access
Ups:
Attack surface minimal.
Very quick to develop
Downs:
Database model has to be maintained twice
Migration and deployment have to be coordinated; hard to maintain.
Makes the system more error-prone.
Migration on release
Ups:
Cheapest to develop
Highest performance on inserts
Downs:
Not flexible
Very slow for users
Many deployments
Manual work (will be costly over time)
In my opinion, go for the API and secure access to it with an OAuth mechanism. If OAuth is too time-consuming to put in place, maybe you can try some simpler auth protocols.
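For illustration, here is a minimal sketch of the endpoint application A could expose, assuming ASP.NET Core Identity is already configured in A. A shared API key stands in for a full OAuth client-credentials flow, and the route, header, and configuration key names are made up:

    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Identity;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Extensions.Configuration;

    [ApiController]
    [Route("internal/users")]
    public class InternalUsersController : ControllerBase
    {
        private readonly UserManager<IdentityUser> _users;
        private readonly IConfiguration _config;

        public InternalUsersController(UserManager<IdentityUser> users, IConfiguration config)
        {
            _users = users;
            _config = config;
        }

        public record CreateUserRequest(string Email, string Password);

        [HttpPost]
        public async Task<IActionResult> Create(
            [FromHeader(Name = "X-Api-Key")] string apiKey,
            [FromBody] CreateUserRequest request)
        {
            // Only callers holding the shared secret (application B) get through.
            if (apiKey != _config["InternalApi:Key"])
                return Unauthorized();

            var user = new IdentityUser { UserName = request.Email, Email = request.Email };
            var result = await _users.CreateAsync(user, request.Password);
            return result.Succeeded ? Ok() : BadRequest(result.Errors);
        }
    }

Note that CORS only constrains browsers; it does not stop a non-browser client from calling the endpoint, so whatever auth you choose should be paired with network-level restrictions (IP allow-listing or a reverse-proxy rule) if B is the only intended caller.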

.NET Core Identity. Use one context or two?

I am struggling with a basic problem: should I keep my ASP.NET Core Identity context separate from my business context, or combine the two?
Philosophically, it seems good to keep them separate. If I want to upgrade .NET Core at some point in the future, I have fewer issues if my identity context is separate.
Practically, it seems like a good idea to combine them. Otherwise, I am creating another user table or doing some weird workarounds to get user information from navigation properties.
How do you handle Identity contexts in .NET Core?
In my experience, it's a good idea to keep the identity context separate from your business context.
You will probably have to apply some expensive encryption and other measures to your identity database. There may be no need to apply such encryption to your business data.
You might choose to have separate backup and recovery plans for your identity database, or replace it entirely; the non-identity database may have its own plan.
More importantly, one fine day you might decide to go with a third-party auth provider and stop managing identity yourself. At that point you can do a migration and throw away the identity database, as it is no longer needed.
So, yes, best to use different contexts/databases.
Note: I am assuming each context is linked to a different database. This is a given, but I'm putting it here for the sake of clarity.
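A minimal sketch of the two-context setup, assuming EF Core with SQL Server and two connection strings; the context, entity, and connection-string names are illustrative. Business entities store only the Identity user id, which avoids the cross-context navigation-property problem mentioned in the question:

    using Microsoft.AspNetCore.Identity;
    using Microsoft.AspNetCore.Identity.EntityFrameworkCore;
    using Microsoft.EntityFrameworkCore;

    var builder = WebApplication.CreateBuilder(args);

    // Identity gets its own context and database.
    builder.Services.AddDbContext<AppIdentityContext>(o =>
        o.UseSqlServer(builder.Configuration.GetConnectionString("IdentityDb")));
    builder.Services.AddIdentity<IdentityUser, IdentityRole>()
        .AddEntityFrameworkStores<AppIdentityContext>();

    // Business data lives in a separate context and database.
    builder.Services.AddDbContext<BusinessContext>(o =>
        o.UseSqlServer(builder.Configuration.GetConnectionString("BusinessDb")));

    var app = builder.Build();
    app.Run();

    public class AppIdentityContext : IdentityDbContext<IdentityUser>
    {
        public AppIdentityContext(DbContextOptions<AppIdentityContext> options) : base(options) { }
    }

    public class BusinessContext : DbContext
    {
        public BusinessContext(DbContextOptions<BusinessContext> options) : base(options) { }
        public DbSet<Order> Orders => Set<Order>();
    }

    public class Order
    {
        public int Id { get; set; }
        // Identity user id stored as a plain string; no foreign key across databases.
        public string OwnerUserId { get; set; } = "";
    }

When you need user details alongside business data, look them up through UserManager by id rather than through a navigation property.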

Cloud database for Azure multi-tenant application?

I am starting to port an old single-tenant desktop application to the cloud and would like to hear your recommendations about databases for my cloud-based multi-tenant application.
My basic requirement is simple:
For each tenant, its data is separate from any other tenant's data. I can easily back up, restore, or export the data of a single tenant without affecting the others.
I don't really want to care about multi-tenancy in the business logic code. It should look like a single-tenant application behind the security layer: no tenant ID passed around, etc.
Easy to query using some mature technology like LINQ.
Availability and scalability, of course: easy to set up replicas, failover, and scaling up and down.
I have gone through some investigation of multi-tenant application development. I have noticed that SQL databases from Azure and AWS are both very expensive (the cost of just the SQL database instance is close to the license fee of the original application), so I definitely can't use a separate SQL database instance per tenant.
Now I'm reading the book Developing Multi-tenant Applications for the Cloud, 3rd Edition, which uses the Azure Storage service to implement multi-tenancy. I haven't finished the book yet; it seems you still have to handle multi-tenancy yourself, and the sample code is already out of date.
I have seen lots of SO questions comparing Azure Table Storage with MongoDB. MongoDB is very new to me, and I'm not sure whether it could easily fulfill my requirements.
I have seen RavenDB as well; it supports multi-tenancy out of the box, but I didn't find good sample code on how to use it in Azure app development.
Hope to hear some good advice from the awesome SO crowd.
I would opt for RavenDB over MongoDB. Even though Raven is a newcomer to the game, it supports most of the features that traditional SQL databases support.
To make the decision, the volume of data you are dealing with is also a key pointer, as is the amount of traffic you are expecting.
Also keep in mind operational costs and development effort. HA and DR scenarios can be problematic when you use Raven or Mongo because you need to host them yourself, whereas Azure Storage protects you by default by maintaining three copies of the data.
So I suggest you weigh the trade-offs carefully and choose based on your business needs, cost optimization, and development and operational effort.
Having a single instance of your application for each tenant is a very expensive way to implement an application; however, I realise that if an application was developed with a single tenant in mind, the cost of changing over can be high.
First, can we start with why you have a desktop application connecting to a database at another location? The latency can really slow down an application. Ideally you would want a locally installed database that syncs with the cloud DB, or appropriate caching added to your application.
However, the DB would still need to differentiate between clients.
Why do you need this to go to a cloud database? Is it for backup purposes, avoiding installing a DB locally on a client's machine, accessing the same data from many machines, or something else?
Unless your application is extremely large, I would recommend rewriting it as a multi-tenant application on one SQL Azure database. The architecture chosen at the beginning of the project doesn't suit your requirements now, and as you expand you will run into further issues.

Looking for guidance on WF4

We have a rather large document routing framework that's currently implemented in SharePoint (with a large set of cumbersome SP workflows), and it's running into the edge of what SP can do easily. It's slated for a rewrite into .NET.
I've spent the past week or so reading and watching WF4 discussions and demonstrations to get an idea of WF4, because I think it's the right solution. I'm having difficulty envisioning how the system will be configured, though, so I need guidance on a few points from people with experience:
Let's say I have an approval that has to be made on a document. When the workflow starts, it'll decide who should approve and send that person an email notification. Inside the notification, the user would have an option to load an ASP.NET page to approve or reject. The workflow would then have to be resumed from the send-email step. If I'm planning on running this as a WCF WF service (with AppFabric and persistence configured), how do I get back into the correct instance of the paused service? I somewhat understand the idea of a correlation handle, but don't think it's meant for this case.
Logging and auditing will be key for this system. I see that AppFabric keeps event logs of this data, but I haven't cracked open the underlying database. Is it simple to use for reporting, or should I create custom logging activities to put around my actions? From experience, which would you suggest?
Thanks for any guidance you can provide. I'm happy to give further examples if necessary.
To send messages to a specific workflow instance you need to set up message correlation between your different Receive activities. In order to do that you need some unique value as part of your message data.
The AppFabric logging works well, but if you want to create a custom logging solution you don't need to add activities to your workflow. Instead you create a custom TrackingParticipant to do the work for you. How you store the data is then up to you.
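As a rough sketch (the class name and console output are placeholders for whatever audit store you choose), a custom TrackingParticipant only has to override Track:

    using System;
    using System.Activities.Tracking;

    public class AuditTrackingParticipant : TrackingParticipant
    {
        protected override void Track(TrackingRecord record, TimeSpan timeout)
        {
            // Persist whichever records matter for auditing; writing to the
            // console here stands in for inserts into your own audit tables.
            var activityState = record as ActivityStateRecord;
            if (activityState != null)
            {
                Console.WriteLine("{0:u} activity '{1}' -> {2}",
                    activityState.EventTime, activityState.Activity.Name, activityState.State);
                return;
            }

            var instanceRecord = record as WorkflowInstanceRecord;
            if (instanceRecord != null)
            {
                Console.WriteLine("{0:u} instance {1} -> {2}",
                    instanceRecord.EventTime, instanceRecord.InstanceId, instanceRecord.State);
            }
        }
    }

When hosting as a workflow service, register it with something like host.WorkflowExtensions.Add(new AuditTrackingParticipant()) before opening the WorkflowServiceHost; a TrackingProfile can narrow which records get delivered to it.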
Your scenario is very similar to the one I used for the Introduction to Workflow Services Hands On Lab in the Visual Studio 2010 Training Kit. I suggest you take a look at the hands on lab or the Windows Server AppFabric / Workflow Services Demo - Contoso HR sample code.

In SAAS architecture, how do I handle db schema and MVC user logins for multi-tenants

Our requirement is something like this.
We are building a multi-tenant website in ASP.NET MVC, and each customer should be able to create their own users as per predefined user roles.
We are thinking about creating a schema for a few tables that would be common across customers, so each customer can log in to the system with their own schema's login and we do not need to alter any queries to serve all of them.
We are referring to http://msdn.microsoft.com/en-us/library/aa479086.aspx (Shared Database, Separate Schemas).
Can someone advise on the following?
1. After creating the schemas, how do we authorize a user against a particular schema?
2. Is it possible for the DB to serve multiple tenants without any changes to the queries?
Thanks in advance
Anil
After much research, I can say that, although it takes more development up front and more checks along the way, shared database and shared schema is the way to go. It puts some limits on how easily you can cater to a client's specific needs, but from my point of view SaaS isn't about catering to a single client's weird needs; it's about catering to the majority of clients. Not that it's a SaaS, but take the iPhone as an example: it was built to cater to the masses. Rather than focusing on doing everything, it's built to be one-size-fits-all through its simplicity. This doesn't help your case when it comes to authorization, but it'll save you dev hours in the long run.
If you are asking this in the context of SQL Server's authentication/authorization mechanism, I can answer by saying that every user has a default schema, which helps the query engine find the required object in the database.
The query engine looks in the user's default schema first to find the required object (table). If it finds the object there it uses it; otherwise it falls back to the system default schema (dbo).
Check this article's How to Refer to Objects section to find out how it works. The article also has some information about security concepts related to schemas.
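To make that concrete, here is a hedged sketch of provisioning a tenant the separate-schema way, assuming one SQL Server login per tenant; the schema, login, and table names are invented. Because each tenant's database user gets a matching DEFAULT_SCHEMA, an unqualified query such as SELECT * FROM Orders resolves to that tenant's table, which is what lets the application queries stay unchanged (question 2); authorizing a user against a schema (question 1) then comes down to connecting with that tenant's login and granting it rights only on its own schema:

    using Microsoft.Data.SqlClient;

    public static class TenantProvisioner
    {
        // tenantName and password are assumed to come from a trusted admin screen;
        // identifiers cannot be parameterized, so validate them strictly in real code.
        public static void CreateTenant(string connectionString, string tenantName, string password)
        {
            string[] statements =
            {
                $"CREATE LOGIN [{tenantName}_login] WITH PASSWORD = '{password}'",
                $"CREATE SCHEMA [{tenantName}]",   // CREATE SCHEMA must run as its own batch
                $"CREATE USER [{tenantName}_user] FOR LOGIN [{tenantName}_login] WITH DEFAULT_SCHEMA = [{tenantName}]",
                $"CREATE TABLE [{tenantName}].Orders (Id INT IDENTITY PRIMARY KEY, Total DECIMAL(18,2))",
                $"GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::[{tenantName}] TO [{tenantName}_user]"
            };

            using var connection = new SqlConnection(connectionString);
            connection.Open();
            foreach (var statement in statements)
            {
                using var command = new SqlCommand(statement, connection);
                command.ExecuteNonQuery();
            }
        }
    }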
