We have a website running on Windows Server 2008 R2, IIS7, SQL Server 2008 R2 and ASP.NET. We would like to upgrade to a faster server in order to speed up our DB queries.
We have backed up our website data from our old SQL Server and restored it on the new server. For some reason, DB performance on the new server is much slower than on the old one. This is despite the fact that the new server has 4 times more RAM, a 30% faster SSD and a CPU twice as fast as the old one. Both servers have the exact same versions of Windows, IIS, .NET and SQL Server.
The question is how can a significantly faster server result in slower DB performance? Any input would be appreciated.
Have you configured the indexes and the tablespace for indexes? SQL Server will need some time to build up its caches and statistics based on usage patterns. You can test this by running the same query a few times; it should speed up on repeated requests.
I would assume the old server's speed was attained through SQL optimisation.
Disk I/O is usually the largest bottleneck in a database. Storing DB files and log files on the same physical disks will slow it down. You can also move tempdb to its own physical drive/array. Run perfmon on the server and look at Average Disk Queue Length; if it is ever higher than n+1, where n is the number of physical disks, then you have a bottleneck.
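From the SQL Server side, the same bottleneck can be confirmed with the I/O-stall DMVs (available since SQL Server 2005), and tempdb can be relocated with `ALTER DATABASE`; the `T:` drive and file paths below are placeholders for a dedicated spindle, not values from the question:

```sql
-- Cumulative read/write stalls per database file; files with the highest
-- stall times are where the disk subsystem is holding SQL Server back.
SELECT DB_NAME(vfs.database_id) AS database_name,
       mf.physical_name,
       vfs.io_stall_read_ms,
       vfs.io_stall_write_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
  ON vfs.database_id = mf.database_id
 AND vfs.file_id     = mf.file_id
ORDER BY vfs.io_stall_read_ms + vfs.io_stall_write_ms DESC;

-- Move tempdb to its own physical drive (placeholder path T:\tempdb);
-- the change takes effect after the SQL Server service restarts.
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'T:\tempdb\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'T:\tempdb\templog.ldf');
```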
We have found the problem... the "Power settings" on the server were set to "Balanced". We set it to "High performance" and the DB is much quicker now.
Sorry for the inconvenience
We have a small (for now) ASP.NET MVC 5 website on a dedicated VPS. When I go to the server and fire up Task Manager, I see that "SQL Server Windows NT - 64 bit" is using around 80% of CPU and 170MB of RAM, and IIS is using 6% CPU and 400MB of RAM. Server specs are:
CPU 1.90Ghz dual core
Memory 2GB
Windows Server 2012
SQL Server Express 2012
Disk Space: 25GB, 2.35GB free.
The database is not very big. Its backup is less than 10MB.
I have tried to optimize the website as much as I could. I added caching to a lot of controllers and implemented donut caching for quite a lot of controllers. But today, even though there were only 5 users online, our search wouldn't work. I restarted Windows on the server and it started working, but the CPU usage went high again the minute the server started. Interestingly, when I open SQL Server Management Studio and try to get the report for top CPU-consuming queries, it says that there are no queries currently consuming any CPU! But at the same time I can see that SQL Server is consuming a lot of CPU. How can I examine what is taking all the CPU? Below is a picture from the server:
I was/am very careful with designing and implementing the website. All the database access is through latest version of Entity Framework. I just wonder if the server's specs are low. Any help would be very much appreciated.
Update:
Here's the result of the sp_who2 stored procedure.
This can happen if SQL Server's memory setting is higher than the available memory on the box. The default "max server memory" setting is 2,147,483,647 MB. In our case the AWS box had only 30.5 GB, so we changed the setting to 26 GB and the CPU usage fell to 40%. You generally want to leave 20% of memory for the OS and its operations.
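As a sketch, the cap described above can be applied with `sp_configure`; the 26 GB value is the one from this answer, so adjust it to leave roughly 20% of your own box for the OS:

```sql
-- Cap the buffer pool at 26 GB (the setting is expressed in MB).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 26624;
RECONFIGURE;
```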
I agree with running SQL Profiler to spot long query durations and large write operations. Also try running perfmon and looking for potential connection leaks (reclaimed connections).
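Complementing Profiler, the plan cache can be queried directly for the statements that have accumulated the most CPU; a minimal sketch against the standard DMVs (SQL Server 2005 and later):

```sql
-- Top 10 cached statements by total CPU since the plan cache was last cleared.
SELECT TOP 10
       qs.total_worker_time / 1000 AS total_cpu_ms,
       qs.execution_count,
       SUBSTRING(st.text,
                 qs.statement_start_offset / 2 + 1,
                 (CASE qs.statement_end_offset
                       WHEN -1 THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset
                  END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```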
Why is it that every time the server goes down and ASP.NET restarts, the response time is SUPER FAST when it comes back up, for a few minutes? I assume it's because everyone is off the server and I am one of the few (or only) people back on it. Is that right?
I have discussed this with our developers and they say the response time is due to everyone being on the server normally (200+ desktops), and when you are the only person on there, it flies. Really? Then does that mean we need newer, faster web servers?
I am not a programmer, but I think there may be two possible answers: one, what the devs say above is true; two, the system is accumulating temp files of some sort that get cleared out when the server crashes and restarts.
How do we prove who might be right? Where does one start to look for asp.net bottlenecks?
windows server 2003
asp.net 3.0
iis6
12GB ram
sql server 2005 (db admin says there is no load issue on sql..)
Some very basic steps you can follow to check whether your server is working at its limits:
First, download Process Explorer from Sysinternals and run it to check two things.
Is your server working at its memory limit?
If yes, what program is eating the memory? Usually SQL Server 2005 uses a lot of memory for its database cache, and this builds up after it has been running for a long time.
Is the server using all of its computing power? If yes, check which program is the one that needs all that computing power.
As the next step, download TCPView from Sysinternals, run it, and see how many connections are open, how fast they are, etc. There you can spot anomalies, or see whether the machine is at its limit here as well.
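The same connection check can also be made from inside SQL Server; a sketch that groups open connections by client address to spot leaks:

```sql
-- Open connections per client machine; one client holding an ever-growing
-- number of connections usually indicates a connection leak.
SELECT client_net_address, COUNT(*) AS open_connections
FROM sys.dm_exec_connections
GROUP BY client_net_address
ORDER BY open_connections DESC;
```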
The final step is to defragment your disks.
Also keep in mind that ASP.NET session state locks each request that uses the same session, so concurrent requests from one user are serialized. So if you have a web application with many users, and some users make calls that take a long time to process, this can cause noticeable delays.
I have my IIS 6 running my website. It is on a Windows Server 2003 which has 4GB of RAM. I run SQL intensive code after the user submits a form (math statistics stuff). This process is not threaded (should it be, especially if 2 or more users run the same thing?). But my process seems to consume only a couple of GBs of memory and the server crawls. How do I get my IIS process to use nearly all the memory?
I see on other sites that it's 2GB or 3GB allocated using boot.ini. But is there another way for the process to use more memory? If I make it multithreaded, will there be a process for each thread?
If there is still free memory, IIS does not need more; giving it more memory will not make it perform better. It is good to see some memory unused and available for other processes besides IIS. If you make it multithreaded, whether more memory is used, and whether you gain any performance, depends on what you actually do in parallel.
The basic approach here is to start with your requirements and see what peak usage you can expect. Then run a performance test to see whether your machine can handle that load. To be sure you have some headroom, run another test to find the peak load your machine can handle. Then you will know whether you have to invest any more time.
Check your database server to see whether the bottleneck is on that machine; most developers forget to optimize and maintain their databases.
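One quick way to see where the database server is actually waiting is the wait-stats DMV; a sketch (the excluded wait types are common idle waits):

```sql
-- Top waits since the last service restart. High PAGEIOLATCH_* waits point
-- at disk I/O; high LCK_M_* waits point at blocking.
SELECT TOP 10 wait_type, wait_time_ms, waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ('SLEEP_TASK', 'LAZYWRITER_SLEEP',
                        'SQLTRACE_BUFFER_FLUSH', 'WAITFOR', 'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;
```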
I have an ASP.NET application (a website, catalog-like). No high load is expected (not more than 50 concurrent users, let's say, and 95% of traffic is reads, since it is a website), but it has some specific needs, so I can't host it on shared hosting and am going for a hosted virtual server. The configuration is: a 2 GHz core (aggregated 1:5), 1024 MB dedicated RAM, 30GB disk space. It is running Windows Server 2008 R2 Web.
Now the questions
Is it a good idea to install and run SQL Server 2008 R2 Express on this virtual server? The limitations are OK (I only have 1 core and 1GB RAM anyway, and 10 GB per DB is perfectly fine), but what about performance? I don't want it to use all available memory, because the ASP.NET app needs some too for intensive caching (maybe a few hundred MBs).
Can I set max server memory for SQL Express to (let's say) 500 MB and expect fair performance? (There is only 1 small DB; for some time it will be under that 500 MB, and for a long time under 1 GB.)
Additionally, are there any options I can use to minimize the memory footprint/requirements of SQL Server Express? (Besides NOT installing Reporting Services, full-text search and replication; none of these are needed in my app.)
1. Is it a good idea to install and run SQL Server 2008 R2 Express on this virtual server?
No. 30GB of disk space is already below what is recommended for Windows alone; adding SQL Server won't help. On top of that, you never know how poor the disk subsystem is, and fast disks are what make a database fly. DB performance is to a large extent controlled by the performance of the disk subsystem, and that is normally bad on virtual disks provided by non-specialised hosters.
2. Can I set max server memory for SQL Express to (let's say) 500 MB and expect fair performance?
Can I load only 100 kg on a car, and is that suitable? It depends on what you do. 500 MB for a 10 GB database may or may not be enough; it depends on usage patterns.
3. Additionally, are there any options I can use to minimize the memory footprint/requirements of SQL Server Express?
That is totally not how SQL Server works. Databases want lots of RAM to use as cache. You can minimize it, sure, but then instead of RAM you burn IOPS, which is a harder resource to get.
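One way to judge whether a tight memory cap is forcing reads onto the disk is to watch page life expectancy; a sketch against the performance-counter DMV:

```sql
-- Page life expectancy: average seconds a data page survives in the buffer
-- pool. A persistently low value under a small memory cap means the cache
-- is churning and reads are spilling to disk (burning IOPS).
SELECT cntr_value AS page_life_expectancy_seconds
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Buffer Manager%'
  AND counter_name LIKE 'Page life expectancy%';
```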
I personally think that a production server, even a VPS, that is so much less powerful than a cheap laptop is totally the wrong approach.
I can't buy the SQL Server full/express plan on my hosting environment, and I was thinking of using SQL CE with EF 4.0.
The expected user load is 1000-2000 per day.
"user load is 1000-2000 per day"
This isn't a particularly good measure of what load your database will be under.
You need to measure things like:
The number and complexity of your queries.
What kind of writes (insert/update/delete) will need to be performed.
How many of those a user might perform.
The amount of data being dealt with in the above queries.
Whether you can cache any of the results of queries.
For instance, I know of systems where having 1000 users required a cluster of high end servers to deal with the load.
If you can model what the performance is like for 50, 100, and 500 users - that could give you an idea of whether you can deal with this load.
FWIW: SQL Server Express Edition is free for commercial usage.
The sqlserverCe DLL (as well as the similar SQLite DLL) has 32-bit native code inside, so there might be some 32/64-bit issues when running on a 64-bit system.
I am not sure whether there is a 64-bit sqlserverCe DLL yet.
SQL Server Compact will run under medium trust under ASP.NET 4, and supports both x64 and x86 platforms. It is limited to max 256 concurrent connections. It is file based, and not quite as robust as SQL Server, and does not support recovery to a point in time.