I'm working on an issue with heavily fragmented indexes on a large production DB. I've pretty much identified the indexes that are heavily fragmented, including those that are not really being used. I plan to rebuild some and remove others. So my next step is to devise a before-and-after timing test.
One of the symptoms is SSRS reports taking about an hour to render. I'm new to Reporting Services. I can see that a report is embedded in the ASPX page using a ReportViewer control with the ServerReport's ReportPath and ReportServerUrl properties set. My problem is figuring out how to time the display of the report from start to finish in the code-behind. I can write the start time to a file in Page_Load, but I can't figure out how to record the end time... PreRender could just hang, and I'm not sure it's the only page lifecycle event I can tap into to record this. Should I use a Windows Service, and if so, how would I trigger/record the start and end times that way?
I'd really appreciate some feedback on whether this is possible via the display page's code-behind.
Have you tried looking in the Reporting Services execution log? It contains several timed events, such as data retrieval time, rendering time, and processing time, plus the actual start and end times. Check ReportServer.dbo.ExecutionLog and ReportServer.dbo.Catalog.
To check the log settings, connect to your SSRS server using SQL Server Management Studio (select Reporting Services, not the database engine, in the connection dialog). Once connected, right-click the server and choose Properties. On the Logging tab you will see the number of days of history to retain; by default this is 60 days.
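If you prefer to check the same setting with a query, it is stored in the ReportServer catalog's ConfigurationInfo table (assuming the default ReportServer database name):

SELECT Name, Value
FROM ReportServer.dbo.ConfigurationInfo
WHERE Name = 'ExecutionLogDaysKept'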
Assuming that is not zero, you can run a simple query like this to get the report execution details:
SELECT *
FROM ReportServer..ExecutionLog e
JOIN ReportServer..Catalog c
ON e.ReportID = c.ItemID
WHERE c.name = 'myReportName'
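For the before-and-after timing comparison specifically, the timing columns are the most useful part. A variant of the same query (column names as in the SSRS 2008 ExecutionLog schema, where the Time* values are in milliseconds):

SELECT c.Name,
       e.TimeStart, e.TimeEnd,
       e.TimeDataRetrieval, e.TimeProcessing, e.TimeRendering,
       e.Status, e.[RowCount]
FROM ReportServer..ExecutionLog e
JOIN ReportServer..Catalog c
    ON e.ReportID = c.ItemID
WHERE c.Name = 'myReportName'
ORDER BY e.TimeStart DESC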
We have a system running Windows Server 2008 R2 x64 and SQL Server 2008 R2 x64 with SSRS installed/configured. This is a shared reporting server used by a large number of people, with some fairly large, inefficient databases (400-500 GB of data or so), and these users generate ad-hoc reports based on a reporting model that sits on top of the aforementioned databases. Note that the users log on and authenticate via NTLM to run reports.
Most reports are quick, but if you run a report over 1 or 2 years' worth of data, it can take a while to return (around 5 minutes). This is fine for most users; however, some users are stuck behind a proxy with a connection timeout of 2 minutes. As SSRS 2008 R2 does not seem to send back a "keep-alive" signal (confirmed via Wireshark), when one of these long reports is running, the proxy server thinks the connection has died, so it just gives up and kills the connection. The user gets a 401 or 503 error and the report is of course cancelled (the incorrect error code is a known bug in SSRS which Microsoft refuses to fix).
We're getting a lot of flak from the users about this, even though it's not really our issue, so I am looking for a creative solution.
So far I have come up with:
1) Discovering some as-yet-unknown SSRS setting that makes it keep the connection alive.
2) Installing our own proxy between the users and our report server, one that WILL send a keep-alive back (not sure this will work and it's a bit hacky; just thinking creatively!).
3) Rewriting our reporting databases to be more efficient (yes, this is the best solution, but also incredibly expensive).
4) Asking the experts :) :)
We have a call booked with Microsoft Support to see if they can help, but can any experts on Stack help out? I appreciate that this may be a better question for Server Fault (and I may post it there), but it's really a development question too :)
Thanks!
A few things:
A. For the SSRS service overall:
I personally use a keep-alive service, as I believe the default recycle interval for the SSRS server is 12 hours. I use a tool someone turned me onto called 'VisualCron' that can run many task processes automatically. You could also just make the call from a WCF service or similar. Basically, I know the first report a user runs each day is generally slow; usually you need to hit http://(servername)/ReportServer to keep it alive.
B. For caching report-level items:
If that does not help, I would suggest caching datasets where possible. Some people need data that is up to the moment, but for a lot of people that is not the case. You can create a shared dataset in SSRS and then cache it on a schedule. So if you have domain-like tables that only need updating once in a blue moon, put them there; the same goes for data that is loaded nightly or in batches. If you are a transactional shop that needs up-to-the-moment data this may not help, but for batch-based businesses it can help tremendously.
As a continuation of this, you can also cache the reports' data. Under the 'Manage' drop-down for a report on the /Reports landing page, you can set the data to refresh on a specific schedule. You can also set up a snapshot, which extends this: the report executes on a schedule with some default parameters set, and a copy of the report as of that run is stored.
You mention ASP.NET, so I am not certain how much of this will apply if you are doing it all through a site you are setting up internally as a pass-through. But you could also email or save files on a schedule through SSRS's subscription service.
C. Change how you store your data for reporting.
You can create a report warehouse from selected item-level values of your queries. Create a small database holding just a few recent years of data, and only certain fields from certain tables. Then index it to death and report off of that. In my experience this method flies in terms of performance, but it does take extra overhead to set up. Generally companies will whine about this, yet it often takes a single day to set up; then you create one SQL Server Agent job (or an SSIS package) that refreshes it nightly and you don't worry about it anymore. I personally like this method because I know my reporting is isolated from production.
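A minimal sketch of the idea; the ProductionDb tables and columns here are hypothetical placeholders for whatever your reports actually need:

CREATE TABLE ReportWarehouse.dbo.SalesSummary (
    OrderDate   date  NOT NULL,
    CustomerID  int   NOT NULL,
    ProductID   int   NOT NULL,
    LineTotal   money NOT NULL
)

-- Index the columns your reports filter and group on
CREATE CLUSTERED INDEX IX_SalesSummary_OrderDate
    ON ReportWarehouse.dbo.SalesSummary (OrderDate)
CREATE NONCLUSTERED INDEX IX_SalesSummary_Customer
    ON ReportWarehouse.dbo.SalesSummary (CustomerID) INCLUDE (LineTotal)

-- Nightly refresh, run from a SQL Server Agent job: keep only recent years
TRUNCATE TABLE ReportWarehouse.dbo.SalesSummary
INSERT INTO ReportWarehouse.dbo.SalesSummary (OrderDate, CustomerID, ProductID, LineTotal)
SELECT o.OrderDate, o.CustomerID, d.ProductID, d.LineTotal
FROM ProductionDb.dbo.Orders o
JOIN ProductionDb.dbo.OrderDetails d ON d.OrderID = o.OrderID
WHERE o.OrderDate >= DATEADD(YEAR, -2, GETDATE())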
After many years of programming, I need to do something asynchronously for the very first time (because it takes several minutes and the web page times out -- don't want the user waiting that long anyway). This action is done by only a few people but could be done a few times per day (for each of them).
From a "Save" click on an ASP.NET web page using LINQ, I'm inserting a record into a SQL Server table. That then triggers an SSIS package to push that record out to several other databases around the country.
So..
How can I (hopefully simply) make this asynchronous so that the user can get on with other things?
Should this be set up on the .NET side or on the SQL side?
Is there a way (minutes later) for the user to know that the process completed successfully? Maybe an email? Not sure how else the user can know it finished fine.
I read some threads on this site about it, but they were from 2009, so I'm not sure how much has changed with Visual Studio 2012/.NET Framework 4.5 (we're still using SQL Server 2008 R2).
It is generally a bad idea to perform long-running tasks in ASP.Net. For one thing, if the application pool is recycled before the task completes, it would be lost.
I would suggest writing the request to a database table, and using a separate Windows Service to do the long-running work. It could update a status column in the database table that could be checked at a later time to see if the task completed or not, and if there was an error.
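A sketch of what that request table might look like (all names are hypothetical); the Windows Service polls it and claims work atomically:

CREATE TABLE dbo.PushRequests (
    RequestID    int IDENTITY(1,1) PRIMARY KEY,
    RecordID     int            NOT NULL,
    RequestedAt  datetime       NOT NULL DEFAULT GETDATE(),
    Status       varchar(20)    NOT NULL DEFAULT 'Pending',  -- Pending / Running / Done / Failed
    ErrorMessage nvarchar(4000) NULL,
    CompletedAt  datetime       NULL
)

-- The service claims the next pending request in one atomic statement
UPDATE TOP (1) dbo.PushRequests
SET Status = 'Running'
OUTPUT inserted.RequestID, inserted.RecordID
WHERE Status = 'Pending'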
You could use Service Broker on the SQL side; it's SQL Server's implementation of message queuing.
Good examples here and here
What you do is create a Service Broker service and define some scaffolding (queues, message types, etc).
Then you create an "activation" procedure for the service, which is basically a stored procedure that consumes messages from the queue. This SP would receive, for example, a message with the ID of a record in a table, and would then go on and do whatever needs to be done to it, perhaps sending an email when it's done, etc.
So from your code-behind, you'd call a simple stored procedure which inserts the user's data into a table, sends a message with, e.g., the ID of the new record to the queue, and then returns immediately. I suppose you should tell the user upfront that this could take a few minutes and that they'll receive an email, etc.
The great thing about Service Broker is message delivery is pretty much guaranteed - even if your SQL Server falls over right after the message is queued, when you bring it back up the activation SP will just kick off again, so it's very robust.
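A minimal sketch of the scaffolding and activation procedure described above. All object names are hypothetical, and a real activation proc would loop and handle error messages too:

-- Requires Service Broker to be enabled on the database:
-- ALTER DATABASE MyDb SET ENABLE_BROKER
CREATE MESSAGE TYPE PushRecordMsg VALIDATION = WELL_FORMED_XML
CREATE CONTRACT PushRecordContract (PushRecordMsg SENT BY INITIATOR)
CREATE QUEUE PushRecordQueue
CREATE SERVICE PushRecordService ON QUEUE PushRecordQueue (PushRecordContract)
GO
-- Activation proc: consume one message and process that record
CREATE PROCEDURE dbo.ProcessPushRecord
AS
BEGIN
    DECLARE @handle uniqueidentifier, @body xml, @msgType sysname, @recordId int
    RECEIVE TOP (1) @handle = conversation_handle,
                    @body = CAST(message_body AS xml),
                    @msgType = message_type_name
    FROM PushRecordQueue
    IF @msgType = 'PushRecordMsg'
    BEGIN
        SET @recordId = @body.value('(/RecordID)[1]', 'int')
        -- ... do the long-running work for @recordId, send the email, etc. ...
        END CONVERSATION @handle
    END
    ELSE IF @handle IS NOT NULL
        END CONVERSATION @handle  -- e.g. an EndDialog message coming back
END
GO
ALTER QUEUE PushRecordQueue WITH ACTIVATION (
    STATUS = ON, PROCEDURE_NAME = dbo.ProcessPushRecord,
    MAX_QUEUE_READERS = 1, EXECUTE AS OWNER)
GO
-- Called from the code-behind: save the row, queue the message, return at once
CREATE PROCEDURE dbo.QueuePushRecord @recordId int
AS
BEGIN
    DECLARE @h uniqueidentifier, @msg xml
    SET @msg = CAST('<RecordID>' + CAST(@recordId AS varchar(10)) + '</RecordID>' AS xml)
    BEGIN DIALOG CONVERSATION @h
        FROM SERVICE PushRecordService TO SERVICE 'PushRecordService'
        ON CONTRACT PushRecordContract WITH ENCRYPTION = OFF
    SEND ON CONVERSATION @h MESSAGE TYPE PushRecordMsg (@msg)
END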
I need some information from you. I have set session.Timeout = 540 in the application. Does that affect my application's performance after some time? When the number of users increases it gets very slow; the response time is nearly more than 2 minutes, even for a button click. This is hosted on a server in an application pool, and I don't know much about application pools. If the session timeout is the problem I will remove it. Please suggest a way to support more users.
Job numbers, customer IDs, and tasks come from one database. When the user clicks the Start button, the data is saved in another database. I need this to be faster for more users.
I think you have some page(s) that do work that takes time, or that for some reason (or a bug) stay open longer than usual.
Such a page keeps the session locked, and it holds back the responses of the other pages, because the session locks all pages from the same user.
Now, combined with the increased timeout, that one page locks everything, and there is your response time of nearly 2 minutes.
The solution is to locate the page with the long-running work and fix it, or make it faster by optimizing the process; if that page really must run for a long time, disable session state for it (e.g. EnableSessionState="ReadOnly" or "False" in its @Page directive).
Related:
Web app blocked while processing another web app on sharing same session
What perfmon counters are useful for identifying ASP.NET bottlenecks?
Replacing ASP.Net's session entirely
Trying to make Web Method Asynchronous
Does ASP.NET Web Forms prevent a double click submission?
About the server
On the other hand, if your server suffers from hardware limits or a bad setup, here is another answer with points you need to check to make it faster.
Find out where the time is spent
Add a Stopwatch to the method behind the button click that takes "more than 2 minutes". That way you can find which statement spends the most time.
If it is a query on the DB that costs the time, check your SQL statement (see the sketch below).
Avoid SELECT * FROM ...; select only the columns you actually need. (Note: SELECT COUNT(*) versus SELECT COUNT(Id) makes little performance difference, and they aren't equivalent anyway, since COUNT(Id) skips NULLs; the real win is avoiding SELECT *.)
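One quick way to see where a suspect statement spends its time is to run it in SSMS with statistics switched on (the query below is just a placeholder):

SET STATISTICS TIME ON
SET STATISTICS IO ON
-- run the statement exactly as the page sends it, e.g.:
SELECT OrderID, OrderDate FROM dbo.Orders WHERE CustomerID = 42
SET STATISTICS TIME OFF
SET STATISTICS IO OFF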
Use cache.
There are many ways to cache, both in ASPX pages and in your business layer.
OutputCache is the easiest way.
Also, cache a page (for example a blog post) the first time a user visits it.
Are you paging in memory?
Be careful when paging a GridView or other list. If you just set DataSource = xxx and call DataBind(), even with PagedDataSource, you are likely paging in memory: the whole result set is fetched and only one page is shown, which costs a lot of performance. Use stored procedures to page in the database instead, as sketched below.
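A sketch of stored-procedure paging that works on SQL Server 2005+ via ROW_NUMBER() (OFFSET/FETCH needs 2012+); the table and column names are hypothetical:

CREATE PROCEDURE dbo.GetOrdersPage
    @PageNumber int,
    @PageSize   int
AS
BEGIN
    SET NOCOUNT ON;
    WITH Numbered AS (
        SELECT OrderID, CustomerID, OrderDate,
               ROW_NUMBER() OVER (ORDER BY OrderDate DESC) AS RowNum
        FROM dbo.Orders
    )
    SELECT OrderID, CustomerID, OrderDate
    FROM Numbered
    WHERE RowNum BETWEEN (@PageNumber - 1) * @PageSize + 1
                     AND @PageNumber * @PageSize;
END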
Check your server environment
Where did you deploy the website? Many ISPs limit bandwidth, the IIS connection count, and the CPU time allotted to your account.
If you have RDP access to your server, you can watch CPU and memory usage to see whether they are high when many users come to your site. If the site is slow and neither CPU nor memory usage is high, it may be a network bandwidth problem.
Here are some simple steps to narrow down the issue -
1) Get HttpWatch (there's a free Basic version) and check what's really taking time from an end-user perspective. Look at the number of requests, the number of resources downloaded, and the payload. If there is nothing to worry about, move on to the next step.
2) If it's not the client, then it's usually the processing time on the server. Jump onto the DB first, since this is the easiest to eliminate quickly. Look at how many DB calls are made (run Profiler in staging or dev) and see if there are any long-running queries or missing indexes or statistics, and note the IO. If all is well, move on.
3) Check your app code. You could use the profiler built into VS.NET or a professional tool such as ANTS. If the code is fine, then it's your network or the external calls you make; check your network bandwidth. If you still cannot narrow it down, check your environment/hardware.
The best way to get to it is to apply load. You could use a simple tool such as ab.exe (which ships with the Apache web server) to throw concurrent hits at your server while running the app and DB profilers in the background to get to the issue.
Hope this helps!
I've got a report in an ASP.NET app. When I try to generate it from the browser, it crashes with a DB timeout error, but when I execute the exact same query in SQL Server Management Studio, it returns the result set within 5 seconds.
The query is written in plain SQL in the code-behind file (no ORM is used), and its parameters come from the web form, so I know exactly what the generated query will be.
What can be the cause of the problem?
First, use SQL Profiler to attach to the database and see exactly what query is being sent. Use that for other testing.
Second, set your connection timeout to something ridiculous like 300 seconds. Then do the same thing for the command timeout.
Third, make sure both your application and your management studio instance are talking to the same database... Preferably with the exact same user rights.
Run again. Then run it again.
It's possible that the database is taking time to do an initial load (hence the first query taking a while) and that the query through Management Studio executes while the database is still "hot", so to speak.
Finally, you say that Management Studio shows results within 5 seconds... is that 5 seconds for it to start populating the query results window, or 5 seconds for the entire query to finish executing? Those can be radically different times.
SQL Server profiler is great for profiling SQL Server performance for web apps. However, when I'm testing my webapp I'd like a summary of database hits/duration per page.
Does anybody know of any utilities for giving you this kind of information?
If you want duration per page, I'd recommend Google Analytics.
If you want a summary of database hits (i.e., you run three procedures during one page load, so you want to show a count of three), then I would recommend adding auditing code to your sprocs.
Alternatively (though more expensively in terms of processing) you could turn on either SQL Profiler or SQL Trace and track the database hits that way, performing statistical analysis on them.
I would recommend setting up a data access routine that is used across the whole site.
This routine/class/whatever you like could log, in the database or in a log file, all the "hits": their duration, errors (if any), timeouts, etc.
If you program it properly, you will be able to know how many DB hits happen per page load and avg(DBHit), plus you get as a free bonus the longest sproc, the shortest, and the most often called.
The positive side of this is that you don't need to modify any stored proc and you can have a nice little "wrapper" around your access to the DB.
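A sketch of the kind of log table the wrapper could write to, plus the summary queries that fall out of it (all names hypothetical):

CREATE TABLE dbo.DbCallLog (
    LogID      bigint IDENTITY(1,1) PRIMARY KEY,
    PageName   nvarchar(200)  NOT NULL,
    ProcName   nvarchar(200)  NOT NULL,
    StartTime  datetime       NOT NULL,
    DurationMs int            NOT NULL,
    ErrorText  nvarchar(4000) NULL
)

-- DB hits and average duration per page load
SELECT PageName, COUNT(*) AS DbHits, AVG(DurationMs) AS AvgMs
FROM dbo.DbCallLog
GROUP BY PageName

-- The "longest sproc" free bonus
SELECT TOP (10) ProcName, MAX(DurationMs) AS WorstMs, COUNT(*) AS Calls
FROM dbo.DbCallLog
GROUP BY ProcName
ORDER BY WorstMs DESC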
For the "Duration per page", if you go with google analysis you will not be able to merge the information back with what you got on the database server. So I would recommend logging each access to a page in the DB.
Then you can infer that Page1.StartTime = getdate(), Page1.EndTime = (page2.Starttime-1 or session.log_off_time) for example. [This is a little basic but according to your environment you can improve it].
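A rough sketch of that inference on 2008-era SQL (no LEAD(), so a self-join finds each page view's successor; PageAccessLog is a hypothetical one-row-per-view table):

SELECT a.SessionID, a.PageName,
       a.AccessTime AS StartTime,
       MIN(b.AccessTime) AS EndTime,
       DATEDIFF(SECOND, a.AccessTime, MIN(b.AccessTime)) AS DurationSec
FROM dbo.PageAccessLog a
LEFT JOIN dbo.PageAccessLog b
    ON b.SessionID = a.SessionID
   AND b.AccessTime > a.AccessTime
GROUP BY a.SessionID, a.PageName, a.AccessTime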