There are several options for storing user information when dealing with ASP.NET Membership providers. I would like to ask whether they are comparable in terms of performance, especially ActiveDirectoryMembershipProvider and SqlMembershipProvider when there are, e.g., 100,000 users recorded.
Both providers can handle that workload; the question is whether the infrastructure underneath can handle it. An AD server holding 100,000 accounts should be big enough to cope.
So the real question, in my eyes, is: are you writing the app for an intranet and do you want to provide SSO functionality? Then, by all means, go with Active Directory!
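For what it's worth, both providers plug into the same static Membership API, so the application code does not change when you switch between them; only the provider registration in web.config does. A minimal sketch (the helper class name is illustrative, not from the question):

```csharp
using System.Web.Security;

public static class LoginHelper
{
    public static bool SignIn(string userName, string password)
    {
        // Works identically against ActiveDirectoryMembershipProvider or
        // SqlMembershipProvider; the concrete provider comes from web.config.
        if (Membership.ValidateUser(userName, password))
        {
            FormsAuthentication.SetAuthCookie(userName, createPersistentCookie: false);
            return true;
        }
        return false;
    }
}
```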
Your question is unanswerable, as "performance" depends greatly upon many factors: for instance, network speed, latency, and saturation, the power of your AD server vs. your SQL Server, the disk subsystems in use in either, and so on.
There is no way to say one way or the other without thoroughly evaluating each environment, and even at that point, you should just benchmark each and determine what works best for you.
In most cases, though, the decision between SQL and AD has nothing to do with performance and everything to do with the features offered by each. I strongly doubt you have 100,000 users in your Active Directory, as that would cost millions of dollars in licensing.
I have a general question about database hosting in relation to WCF and ASP.NET. We are currently developing a new online web application in ASP.NET, which gets and posts data to our MSSQL database via a WCF service (a three-tier architecture).
Later in development we will be launching the website and hosting it with an external provider. We are unsure whether to keep the database for the website internally on our own servers, or host it externally with our provider (they offer database hosting options as well).
If we hosted it externally, we would obviously back it up internally using batch scripts etc.
One major concern is the security of the database, as we are only a small business with not much experience in web security architecture. Due to this, we are leaning towards an external provider for both the website and database, who would obviously have experience and the equipment to manage such things.
Could you please offer some opinions on the matter?
Thanks!
There's always a risk associated with handing sensitive data off to an outside party, and trusting them to be as secure as you need.
There's no mystery here: someone at the provider will have enough access to look at your data if they really want to. So it all boils down to how sensitive your data is. Is there bank account info or are there social security numbers? For those reasons, our company cannot hand such data off to an outside party.
I'm a little confused about one thing, though: if you could potentially host the database server when you go to production, why couldn't you host the website as well? Is it a matter of being able to handle high traffic?
Update in response to your comment:
It sounds like your data is somewhat sensitive, not highly sensitive. In which case, if we're not being totally bonkers pedantic here, you can reasonably assume a reputable hosting company will take the proper measures to secure your data, and from the sounds of it, they're probably more capable in this respect than your own company (not because you're careless or wet behind the ears, just because they would have considerable experience in this area where your company does not).
Now for the performance and hardware setup part of your comment: if you don't have the hardware or network infrastructure to meet your requirements, then you either a) upgrade your own infrastructure and hire the appropriate personnel to set it up and maintain it, or b) pay someone else to do it. Sounds like a no-brainer for you guys to go with option b.
I have been looking for a solution that would allow us to monitor our web servers' performance counters through an ASP.NET website.
Is there an existing tool that I can make use of to accomplish this or will I need to roll my own?
The only solution I have found online is using perfmon to connect to the remote server, which I need to avoid.
The only criteria we have are the ability to select or configure which counters are used, and a web interface to view these counters at a later date. We need a historical record of the servers' performance.
We are using ASP.NET websites on IIS.
Thanks
Using perfmon remotely is the standard way to monitor performance counters; it is done by sysadmins across the globe.
Why do you need to avoid this?
However, if you must avoid it, you will need to roll your own. I have done this in the past (for users who could not figure out perfmon...).
In terms of historical data, you will need to poll the performance counters yourself and record the data somehow (database, flat files, etc.).
You can also set up a website to display current values and to control and configure performance counters; the account the site runs under will require sufficient permissions, however.
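For the polling and recording part, a minimal sketch along these lines could be hosted in a Windows service or scheduled task (the counters shown are standard Windows/ASP.NET counters; the CSV path and sampling interval are placeholders):

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Threading;

class CounterLogger
{
    static void Main()
    {
        // Standard Windows / ASP.NET counters; add whichever ones you need.
        var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");
        var requests = new PerformanceCounter("ASP.NET Applications", "Requests/Sec", "__Total__");

        // Rate counters return 0 on the first read, so prime them once.
        cpu.NextValue();
        requests.NextValue();

        while (true)
        {
            Thread.Sleep(TimeSpan.FromSeconds(15));   // sampling interval (placeholder)
            string line = string.Format("{0:o},{1:F1},{2:F1}",
                DateTime.UtcNow, cpu.NextValue(), requests.NextValue());

            // Placeholder storage: append to a CSV file; swap in a database
            // insert if you want richer querying from the web front end later.
            File.AppendAllText(@"C:\PerfLogs\counters.csv", line + Environment.NewLine);
        }
    }
}
```

The website then only has to read from whatever store you chose, rather than touching the counters on the remote servers directly.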
I have an ASP.NET web application hosted in a web-farm environment, and I need a way to measure how much of my database each user is using.
There are several reasons for this; I'll mention a couple. First, because I pay for the database space per month, I want a reasonable way to charge my users. Second, it would be nice to know (again on a per-user basis) when to inform a user to upgrade their subscription.
I don't have much experience with RDBMSs; I come from a different background (Windows applications, graphics), so I can't figure out whether this is possible and, if it is, how it can be handled: through SQL or through ASP.NET (some tool, library, etc.).
If you have some other idea as well, I'd like to hear what you suggest.
Any other advice on this subject, including good places to learn, would also be appreciated.
It depends on your schema. If you use a database-per-user multi-tenant schema, then it is very easy: the size of the database is the space consumed, and it is really easy to measure and, more importantly, enforce. If you use a shared-database schema, then you'll need to keep track in each table of which rows belong to which user and do the accounting yourself. Both measurement and enforcement are more difficult in that case, and there is no general answer; you will have to write code to account for the bytes used and to enforce any maximum-size-per-user constraint.
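As a rough illustration of the database-per-tenant case, something like the following could report space per tenant database. The connection string and the 'Tenant_%' naming convention are assumptions for the sketch, and sys.master_files reports allocated file size, so run sp_spaceused per database if you need space actually used:

```csharp
using System;
using System.Data.SqlClient;

class TenantSizeReport
{
    static void Main()
    {
        // Allocated size (data + log) per tenant database, in MB.
        const string sql = @"
            SELECT d.name, SUM(mf.size) * 8 / 1024 AS size_mb
            FROM sys.master_files mf
            JOIN sys.databases d ON d.database_id = mf.database_id
            WHERE d.name LIKE 'Tenant_%'      -- assumed naming convention
            GROUP BY d.name;";

        using (var conn = new SqlConnection("Server=.;Integrated Security=true")) // placeholder
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    Console.WriteLine("{0}: {1} MB", reader.GetString(0), reader.GetInt32(1));
        }
    }
}
```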
When an ASP.NET website has about 1,000 active users, it works well.
What should I do if the website has about 100,000 active users?
How do I upgrade my ASP.NET app to support a larger number of users?
By changing the web app's architecture?
Or by buying more web servers?
I just wonder how, in the real world, other people build ASP.NET websites that support millions of users. What app architecture does a website need to support that?
Any suggestion will be welcome.
First, make sure you're with a first rate hosting provider.
Second, download a performance profiler (I always suggest Red Gate Performance Profiler) and profile your app. Find the bottlenecks and eliminate them. Repeat until you get your desired performance metric.
If your application is querying a database or other web services, try to use asynchronous methods. Using async methods will free up the web server to handle many more client requests while it is waiting for a response from the database server or web service.
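As a small sketch of that idea, using the async/await support in .NET 4.5+ with MVC (earlier versions achieve the same effect with asynchronous pages or Begin/End handlers; the connection string and query here are placeholders):

```csharp
using System.Data.SqlClient;
using System.Threading.Tasks;
using System.Web.Mvc;

public class ReportsController : Controller
{
    // While the database call is awaited, the request thread goes back to the
    // pool and can serve other clients instead of blocking.
    public async Task<ActionResult> Index()
    {
        using (var conn = new SqlConnection("Server=.;Database=Shop;Integrated Security=true")) // placeholder
        using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Orders", conn))                   // placeholder
        {
            await conn.OpenAsync();
            int orderCount = (int)await cmd.ExecuteScalarAsync();
            return Content("Orders: " + orderCount);
        }
    }
}
```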
You say it "works good" at the moment. It's impossible to know what the point at which this may change will be wihtout knowing a whole lot more about the nature of your traffic, current set up, what else runs on the server, etc ,etc. It could be that it continues to "work good" with a million users as it is.
When you need to make changes (and slowly degrading performance will alert you), that's when you need to worry. And then, as Justin says, knowing the potential bottlenecks will give you pointers as to what solution you need.
Buying more servers is one strategy. So is changing the architecture. The easiest and most cost-effective is throwing more servers at it. It does depend a little on the current application architecture, but nothing that can't be easily overcome.
What I suggest is to load test your application. See what happens as you increase the number of active users. Who knows, it might handle 100k active users, maybe it won't, but at least you will know the tipping point.
As for what you should do, that really depends on your business needs. If your company has the money and this is a core product, then it makes sense to architect a robust application. If it isn't, maybe throwing hardware at the problem is good enough.
It would also help if you could define an active user. Is it someone who is visiting your site and has a session? Is it 100k concurrent requests to the server...?
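If you just want a rough feel before reaching for a proper load-testing tool (Visual Studio load tests, JMeter, etc.), a crude sketch like this can at least show where response times start to fall over; the URL and user count are placeholders:

```csharp
using System;
using System.Diagnostics;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class LoadTest
{
    static void Main()
    {
        const int concurrentUsers = 500;                    // placeholder
        const string url = "http://localhost/yourapp/";     // placeholder

        // Without this, the .NET Framework throttles outgoing connections per host.
        ServicePointManager.DefaultConnectionLimit = concurrentUsers;

        var client = new HttpClient();
        var stopwatch = Stopwatch.StartNew();

        // Fire all requests at once and wait for them to finish.
        Task<HttpResponseMessage>[] requests = Enumerable
            .Range(0, concurrentUsers)
            .Select(_ => client.GetAsync(url))
            .ToArray();
        Task.WaitAll(requests);

        stopwatch.Stop();
        int failures = requests.Count(t => !t.Result.IsSuccessStatusCode);
        Console.WriteLine("{0} requests in {1} ms, {2} failed",
            concurrentUsers, stopwatch.ElapsedMilliseconds, failures);
    }
}
```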
In terms of hardware scaling: scale up or scale out.
In terms of software scaling: profile your app.
I've used ASP.NET profiles (using the AspNetSqlProfileProvider) for holding small bits of information about my users. I started to wonder how it would handle a robust profile for a large number of users. Does anyone have experience using this on a large website with large numbers of simultaneous users? What are the performance implications? How about maintenance?
Running queries against this via SQL is a bit tricky, I have found, but I have worked with clients that have scaled it up to a few hundred properties and 10K+ users without difficulty. Granted, that is not a lot of users, but it is working thus far.
I think it really depends on the specific project and your exact needs when it comes to working with the profile information. Do you need to query it regularly via SQL? Do you just need it for user display? These kinds of details might help provide a more solid answer for your needs.
The SQL provider's performance is closely tied to big-iron throughput: it is more or less directly proportional to a single SQL Server's ability to handle the number of queries. Scale-up is the only option, so as such it's not really five-nines robust out of the box.
You'll have to figure out whether you need scale-out performance and availability, e.g. through partitioning, replication, redundancy, etc., and at what cost to performance. Some of those capabilities are possible as is; the current implementation is more aimed at the middle market and the enterprise.
The good thing is you can plug in your own implementation of the profile provider and then attach it to services and systems with the capabilities outlined above.
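A bare-bones sketch of what plugging in your own provider looks like: only the two members that actually read and write property values are fleshed out, against a hypothetical IProfileStore you would back with SQL, LDAP, a cache, or whatever scales for you; the management/search members simply throw, which is fine if nothing calls them.

```csharp
using System;
using System.Collections.Specialized;
using System.Configuration;
using System.Web.Profile;

// Hypothetical backing-store abstraction; implement it over SQL, LDAP, a cache, etc.
public interface IProfileStore
{
    string Load(string userName, string property);
    void Save(string userName, string property, string value);
}

public class CustomProfileProvider : ProfileProvider
{
    private IProfileStore _store;

    public override string ApplicationName { get; set; }

    public override void Initialize(string name, NameValueCollection config)
    {
        base.Initialize(name, config);
        // _store = ...create your backing store from the config attributes here
    }

    public override SettingsPropertyValueCollection GetPropertyValues(
        SettingsContext context, SettingsPropertyCollection properties)
    {
        string user = (string)context["UserName"];
        var values = new SettingsPropertyValueCollection();
        foreach (SettingsProperty prop in properties)
        {
            values.Add(new SettingsPropertyValue(prop)
            {
                PropertyValue = _store.Load(user, prop.Name)
            });
        }
        return values;
    }

    public override void SetPropertyValues(
        SettingsContext context, SettingsPropertyValueCollection values)
    {
        string user = (string)context["UserName"];
        foreach (SettingsPropertyValue value in values)
            if (value.IsDirty)
                _store.Save(user, value.Name, value.PropertyValue as string);
    }

    // Management/search members: not needed for plain get/set profile usage.
    public override int DeleteProfiles(ProfileInfoCollection profiles) { throw new NotSupportedException(); }
    public override int DeleteProfiles(string[] usernames) { throw new NotSupportedException(); }
    public override int DeleteInactiveProfiles(ProfileAuthenticationOption option, DateTime since) { throw new NotSupportedException(); }
    public override int GetNumberOfInactiveProfiles(ProfileAuthenticationOption option, DateTime since) { throw new NotSupportedException(); }
    public override ProfileInfoCollection GetAllProfiles(ProfileAuthenticationOption option, int pageIndex, int pageSize, out int totalRecords) { throw new NotSupportedException(); }
    public override ProfileInfoCollection GetAllInactiveProfiles(ProfileAuthenticationOption option, DateTime since, int pageIndex, int pageSize, out int totalRecords) { throw new NotSupportedException(); }
    public override ProfileInfoCollection FindProfilesByUserName(ProfileAuthenticationOption option, string usernameToMatch, int pageIndex, int pageSize, out int totalRecords) { throw new NotSupportedException(); }
    public override ProfileInfoCollection FindInactiveProfilesByUserName(ProfileAuthenticationOption option, string usernameToMatch, DateTime since, int pageIndex, int pageSize, out int totalRecords) { throw new NotSupportedException(); }
}
```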
We wrote custom authn, authz, and profile providers and strapped them to a large AD LDS LDAP cluster across three datacenters. We're in the comScore Top 10, so you could say that we deal with a good slice of the internet every day. Thousands of profile queries per second and hundreds of millions of profiles: it can scale with good planning, engineering, and operations.