ASP.NET performance counter logging, reporting tool

I have been looking for a solution that allows us to monitor our web servers' performance counters through an ASP.NET website.
Is there an existing tool that I can make use of to accomplish this or will I need to roll my own?
The only solution I have found online is using perfmon to connect to the remote server, which I need to avoid.
The only criteria we have are the ability to select or configure which counters are used, and a web interface to view these counters at a later date. We need a historical record of the servers' performance.
We are using asp.net websites on IIS.
Thanks

Using perfmon is the standard way to monitor performance counters remotely. This is done by sysadmins across the globe.
Why do you need to avoid this?
However, you will need to roll your own. I have done this in the past (for users who could not figure out perfmon...).
In terms of historical data - you will need to poll the performance counters yourself and record the data somehow (database, flat files, etc.).
You can also set up a website to display current values and to control and configure performance counters - the accounts the site runs under will require sufficient permissions, however.
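A minimal sketch of the polling approach, assuming System.Diagnostics.PerformanceCounter and example counter/instance names (these vary by machine and .NET/IIS version): it samples a couple of counters once a minute and appends them to a CSV file that a reporting page can read later.

```csharp
// Hypothetical polling loop: sample a few counters once a minute and append
// them to a CSV file. Counter and instance names are examples and can differ
// per machine and .NET/IIS version.
using System;
using System.Diagnostics;
using System.IO;
using System.Threading;

class CounterLogger
{
    static void Main()
    {
        var counters = new[]
        {
            new PerformanceCounter("Processor", "% Processor Time", "_Total"),
            new PerformanceCounter("ASP.NET Applications", "Requests/Sec", "__Total__")
        };

        // Rate counters return 0 on the first read, so prime them once.
        foreach (var counter in counters) counter.NextValue();

        while (true)
        {
            Thread.Sleep(TimeSpan.FromMinutes(1));
            foreach (var counter in counters)
            {
                var line = string.Format("{0:o},{1}\\{2},{3}",
                    DateTime.UtcNow, counter.CategoryName, counter.CounterName, counter.NextValue());
                File.AppendAllText("counters.csv", line + Environment.NewLine);
            }
        }
    }
}
```

Run something like this from a Windows service or scheduled task under an account that is allowed to read the counters; the website then only needs to read the CSV (or database) it produces.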

Related

how to sync data between company's internal database and externally hosted application's database

My organisation (a small non-profit) currently has an internal production .NET system with SQL Server database. The customers (all local to our area) submit requests manually that our office staff then input into the system.
We are now gearing up towards online public access, so that customers will be able to see the status of their existing requests online, and in future also be able to create new requests online. A new ASP.NET application will be developed for this.
We are trying to decide whether to host this application on-site on our servers (with direct access to the existing database) or to use an external hosting service provider.
Hosting externally would mean keeping a copy of Requests database on the hosting provider's server. What would be the recommended way to then keep the requests data synced real-time between the hosted database and our existing production database?
Trying to sync back and forth between two in-use databases will be a constant headache. The question that I would have to ask you is if you have the means to host the application on-site, why wouldn't you go that route?
If you have a good reason not to host on site but you do have some web infrastructure available to you, you may want to consider creating a web service which provides access to your database via a set of well-defined methods. Or, on the flip side, you could make the remotely hosted database that sits alongside your website your production database, and use a web service to access it from your office system.
In either case, providing access to a single database will be much easier than trying to keep two different ones constantly and flawlessly in sync.
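If you go the web-service route, here is a minimal sketch of what those "well-defined methods" could look like with ASP.NET Web API; the controller, repository interface and DTO names are illustrative assumptions, not anything from your existing system.

```csharp
// Illustrative Web API controller that is the only component allowed to touch
// the Requests database; both the public site and the office system call it.
using System.Web.Http;

public interface IRequestRepository            // wraps the single production database
{
    string GetStatus(int requestId);
    int Create(NewRequestDto request);
}

public class NewRequestDto
{
    public string CustomerName { get; set; }
    public string Details { get; set; }
}

public class RequestsController : ApiController
{
    private readonly IRequestRepository _repository;

    public RequestsController(IRequestRepository repository)
    {
        _repository = repository;
    }

    // GET api/requests/{id} - lets a customer see the status of an existing request
    public IHttpActionResult Get(int id)
    {
        var status = _repository.GetStatus(id);
        return status == null ? (IHttpActionResult)NotFound() : Ok(status);
    }

    // POST api/requests - for the later phase where customers create requests online
    public IHttpActionResult Post(NewRequestDto request)
    {
        var newId = _repository.Create(request);
        return Created("api/requests/" + newId, newId);
    }
}
```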
If a web service is not practical (or you have concerns about availability) you may want to consider a queuing system for synchronization. Any change to the database (local or hosted) is also added to a message queue. Each side monitors the queue for changes that need to be made and then applies them. This also covers the case where one of the databases is unavailable at any given time.
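A hedged sketch of that queue idea using MSMQ (System.Messaging); the queue path and message shape are assumptions for illustration, and any messaging system would do.

```csharp
// Hypothetical MSMQ-based change queue: each side publishes a small message when it
// writes to its own database, and a worker on the other side drains the queue and
// applies the change. Queue path and ChangeMessage shape are illustrative only.
using System;
using System.Messaging;

public class ChangeMessage
{
    public int RequestId { get; set; }
    public string NewStatus { get; set; }
    public DateTime ChangedAtUtc { get; set; }
}

public static class ChangeQueue
{
    private const string Path = @".\Private$\request-changes";

    public static void Publish(ChangeMessage change)
    {
        if (!MessageQueue.Exists(Path)) MessageQueue.Create(Path);
        using (var queue = new MessageQueue(Path))
            queue.Send(change);                          // default XmlMessageFormatter serializes the body
    }

    public static void Drain(Action<ChangeMessage> applyToLocalDb)
    {
        using (var queue = new MessageQueue(Path))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(ChangeMessage) });
            while (true)
            {
                var message = queue.Receive();                // blocks until a message arrives
                applyToLocalDb((ChangeMessage)message.Body);  // e.g. UPDATE the local Requests row
            }
        }
    }
}
```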
That being said, I agree with @LeviBotelho: syncing two DBs is a nightmare and should probably be avoided if you can. If you must, you can also look into SQL Server replication.
Ultimately the data is the same: customer-submitted data. Currently it is entered by them through you; ultimately it will be entered directly by them. I see no need for two different databases holding the same data. The replication errors alone, when they pop up (and they will), will be a headache for your team for nothing.

Monitoring performance counters

I have a smallish ASP.NET site. I want to instrument it, in particular logging the values of performance counters. Is there an external service which would do this? E.g. New Relic RPM, except I need the solution to work on a shared hosting provider where I don't have access to the hosting server.
Thanks

SQLite: use it for websites, but not for client/server apps?

After reading this question and the suggested link explaining when it is more appropriate to use SQLite vs another DB, one simple thing is still unclear to me, and I hope someone can clarify it.
They say:
Situations Where SQLite Works Well
Websites: SQLite usually will work great as the database engine for low to medium traffic websites...
...
Situations Where Another RDBMS May Work Better
Client/Server Applications: If you have many client programs accessing a common database over a network...
Isn't a website also a client/server app?
I mean, I don't understand: a website is exactly a situation where I have many client programs (users with their web browsers) concurrently accessing a common DB via one server application.
Just to keep it simple: at the end of the day, is it possible for instance to use this SQLite for an ecommerce site or an online catalog or a CMS site with about 1000 products/pages?
The users' web browsers don't directly access the database; the web application does. And normally the request/response cycle for each page the user views will be very fast, usually lasting a fraction of a second.
IIRC, a transaction in SQLite locks the whole database file, meaning that if a web app request requires a blocking transaction, all traffic will effectively be serialized. This is fine for a low-to-medium traffic website, because many requests per second can still be handled.
In a client-server database application, however, multiple users may need to keep connections open for longer periods of time, and may also need to perform transactions. This is far less of a problem for bigger RDBMS systems because locking can be performed in a more fine-grained way.
SQLite can allow multiple client reads but only a single client write. See: https://www.sqlite.org/faq.html
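One hedged mitigation for that single-writer behaviour in a small site is to turn on WAL mode and a busy timeout, so readers are not blocked by the writer and a blocked writer waits instead of failing with "database is locked". A sketch using the System.Data.SQLite provider (an assumption; Microsoft.Data.Sqlite is similar):

```csharp
// Hedged sketch: WAL mode lets readers proceed while one writer works, and
// busy_timeout makes a blocked writer wait instead of erroring immediately.
using System.Data.SQLite;

public static class CatalogDb
{
    public static SQLiteConnection Open(string path = "catalog.db")   // file name is a placeholder
    {
        var connection = new SQLiteConnection("Data Source=" + path);
        connection.Open();

        using (var pragmas = new SQLiteCommand(
            "PRAGMA journal_mode=WAL; PRAGMA busy_timeout=5000;", connection))
        {
            pragmas.ExecuteNonQuery();
        }
        return connection;
    }
}
```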
Client/server is when multiple clients do simultaneous writes to the database, such as order entry where there are multiple users simultaneously inserting and updating information, or a multi-user blog where there are multiple simultaneous editors.
A website, in the read-only case, is not client/server but rather simply a server with multiple requests. In many cases, a website is heavily cached and the database is rarely accessed, if at all.
In the case of a slightly used ecommerce website, say a few simultaneous shoppers, this could be supported by SQLite, or by MySQL. Somewhere there is a line where performance is better for a highly-concurrent database as opposed to SQLite.
Note that the number of products/pages is not a great way to determine the requirement for MySQL over SQLite, rather it is the number of concurrent users, and at what point their concurrent behavior experiences slowness due to waiting for locks to clear.
A website isn't necessarily a client/server application in the sense they're using.
I think when they say website, they mean that the web application will directly manage the database. That is, the database file will live within the web site and will not be accessed via any other means. (A single point of access, put simply.)
In contrast, a client/server app may have the web site accessing the data store as well as another web site, a SOAP client or even a smart client. In this context, you have multiple clients accessing one database (the server). This is where the web site would become (yet another) client.
Another aspect to consider when contrasting the two is the ratio of writes to reads. I think SQLite will perform happily when there is little writing going on compared to the amount of reads. SQLite, as I understand it, doesn't do well in a multiple-writer scenario. It's intended for a single process (or a handful) to be manipulating it.
I mainly use SQLite in embedded applications (iOS, Android). For larger, more complex websites (like the one you're describing) I would use something like MySQL.

How to upgrade my asp.net app to support more users?

When an ASP.NET website has about 1,000 active users, it works well.
What should I do if the website has about 100,000 active users?
How to upgrade my asp.net app to support a larger number of users?
Changing the webApp's architecture?
Or buying more web servers?
I just wonder in the real-world, how do other people build an asp.net website supporting millions of users? What's the app architecture of a website to support that?
Any suggestion will be welcome.
First, make sure you're with a first rate hosting provider.
Second, download a performance profiler (I always suggest Red Gate Performance Profiler) and profile your app. Find the bottlenecks and eliminate them. Repeat until you get your desired performance metric.
If your application is querying a database or other web services, try to use asynchronous methods. Using async methods will free up the web server to handle a lot more client requests while it is waiting for a response from the database server or web service.
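For example, a minimal async controller action (ASP.NET MVC on .NET 4.5+); the connection string and query are placeholders.

```csharp
// Minimal async action sketch: awaiting the database call returns the request
// thread to the thread pool while SQL Server does its work.
using System.Data.SqlClient;
using System.Threading.Tasks;
using System.Web.Mvc;

public class ReportsController : Controller
{
    public async Task<ActionResult> OrderCount()
    {
        using (var connection = new SqlConnection("<your connection string>"))
        using (var command = new SqlCommand("SELECT COUNT(*) FROM Orders", connection))
        {
            await connection.OpenAsync();                         // no thread is blocked here
            var count = (int)await command.ExecuteScalarAsync();  // ...or here
            return Content(count.ToString());
        }
    }
}
```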
You say it "works good" at the moment. It's impossible to know at what point this may change without knowing a whole lot more about the nature of your traffic, your current setup, what else runs on the server, etc. It could be that it continues to work well with a million users as it is.
When you need to make changes (and slowly degrading performance will alert you), that's when you need to worry. And then, as Justin says, knowing the potential bottlenecks will give you pointers as to what solution you need.
Buying more servers is one strategy. So is changing the architecture. The easiest and most cost-effective approach is throwing more servers at it. It does depend a little on the current application architecture, but nothing that can't be easily overcome.
What I suggest is to load test your application. See what happens as you increase the active users. Who knows, it might handle 100k active users; maybe it won't, but at least you will know the tipping point.
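A toy load-generation sketch (an assumption; a proper tool such as JMeter or Visual Studio load tests will give far better numbers): fire N concurrent requests and watch how response time and failure rate change as N grows between runs.

```csharp
// Quick-and-dirty load probe, not a substitute for a real load-testing tool.
using System;
using System.Diagnostics;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class QuickLoadTest
{
    static async Task Main()
    {
        const int concurrentUsers = 200;                 // increase between runs to find the tipping point
        const string url = "http://localhost/yourapp/";  // placeholder URL

        using (var client = new HttpClient())
        {
            var stopwatch = Stopwatch.StartNew();
            var requests = Enumerable.Range(0, concurrentUsers)
                                     .Select(_ => client.GetAsync(url));
            var responses = await Task.WhenAll(requests);
            stopwatch.Stop();

            Console.WriteLine("{0} requests, {1} failed, {2} ms total",
                concurrentUsers,
                responses.Count(r => !r.IsSuccessStatusCode),
                stopwatch.ElapsedMilliseconds);
        }
    }
}
```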
In regards to what you should do, that really depends on your business needs. If your company has the $$ and this is a core product, then it makes sense to architect a robust application. If it's not, maybe throwing hardware at the problem is good enough.
It would also help if you could define an active user. Is it someone who is visiting your site and has a session? Is it 100k concurrent requests to the server...?
In terms of hardware scaling: Scaling Up or Scaling out
Software scaling - Profile your app

Monitoring load on ASP.NET Application

I am looking for ways to keep track of simultaneous users within an application. I cannot use IIS logs because a load balancer hides the user's IP address. I am looking for a .NET code-based solution or a configuration item, possibly with health monitoring, to be able to track the "true" simultaneous user count.
I know that I can monitor the number of sessions, but that isn't really ideal, as the count can be inflated by users abandoning their sessions.
There is a similar question here: Tools and methods for live-monitoring ASP.NET web applications?
I found an advanced logging tool for debugging and monitoring .NET applications: SmartInspect. But I don't know if it meets your requirements.
What do you mean by "simultaneous users"? Perhaps you should monitor simultaneous TCP connections to your IIS application? The Windows Performance Monitor tools should help you there.
Otherwise there is no sure way of telling how many users are using your application right now. If you can monitor the number of sessions, then I'd suggest going with that. Just take into account the last modification time of the sessions, so you get something like "active sessions in the last minute". That should give you a close measurement.
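A hedged sketch of that idea: record the last time each session was seen (for example from Application_AcquireRequestState in Global.asax) and count the ones touched within the window. The storage and one-minute window are illustrative choices.

```csharp
// In-memory tracker for "active sessions in the last minute".
using System;
using System.Collections.Concurrent;
using System.Linq;

public static class ActiveUserTracker
{
    private static readonly ConcurrentDictionary<string, DateTime> LastSeen =
        new ConcurrentDictionary<string, DateTime>();

    // Call once per request with Session.SessionID (or an authenticated user name).
    public static void Touch(string sessionId)
    {
        if (!string.IsNullOrEmpty(sessionId))
            LastSeen[sessionId] = DateTime.UtcNow;
    }

    // "Simultaneous" users = sessions seen within the window, e.g. TimeSpan.FromMinutes(1).
    public static int CountActive(TimeSpan window)
    {
        var cutoff = DateTime.UtcNow - window;
        return LastSeen.Count(entry => entry.Value >= cutoff);
    }
}
```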
In the end we decided to use ASP.NET Performance counters, as well as generic information from the IIS Logs.
I parsed the information from both sources using the Microsoft Log Parser tool!
You just want to know the number of active users at a particular time? An easy option that omits inactive users as well as most bots would be to register the user as active through a JavaScript AJAX call on page load, along with their SessionID. You can then purge old records from the log as you see fit. Be careful how you design the table for read/write performance. ... just an idea off the top of my head.
We are using an expensive solution, AVIcode, but it is great. You can monitor so many things with it.
