I have a dataset in a Visual Studio 2010 Web App project which accesses the DB with a complex SQL statement. If I run the statement in SQL Management Studio directly, it loads in less than a second. If, however, I run it using the "Preview Data" button in the dataset designer, or I try to access it on a page (with a GridView, for example), it takes over 40 seconds!
What steps should I take to track down what's causing this huge delay when working with the dataset?
There are two possible cases:
The problem is at the application level
The problem is at the database level, i.e. in the SQL query itself
So as a first step, try to rule out one of the two. In my experience the SQL side is the easier one to check:
Run SQL Profiler
Run the query from Management Studio
Save the profiler trace
Clear the trace
Run "Preview Data" in the dataset designer
Compare the two traces and check whether the SQL the application sends differs from what you ran by hand (for example, in how it is parameterized)
Some steps to follow:
Attach SQL Profiler to the database server to see exactly which SQL command the app is executing
The DataSet class carries noticeable overhead; for read-only access, try a DataReader instead
Use a Stopwatch instance to take timings at different points and pin down which lines of code are slow (see the sketch below)
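To illustrate the last two points, here is a minimal sketch that times filling a DataSet against reading the same query with a SqlDataReader; the connection string and SQL text are placeholders for your own:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;

class QueryTiming
{
    static void Main()
    {
        // Placeholder connection string and query - substitute your own.
        const string connStr = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True";
        const string sql = "SELECT * FROM dbo.MyComplexView";

        var sw = Stopwatch.StartNew();
        using (var conn = new SqlConnection(connStr))
        using (var da = new SqlDataAdapter(sql, conn))
        {
            var ds = new DataSet();
            da.Fill(ds);   // roughly what the typed DataSet does under the covers
        }
        Console.WriteLine("DataSet fill: {0} ms", sw.ElapsedMilliseconds);

        sw = Stopwatch.StartNew();
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var rdr = cmd.ExecuteReader())
            {
                while (rdr.Read()) { /* consume rows */ }
            }
        }
        Console.WriteLine("DataReader: {0} ms", sw.ElapsedMilliseconds);
    }
}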
Related
I'm trying to create a SQL Server database and then use it to populate a list of important items, but I can't get it to work. I went to the toolbox, chose a panel, and then, when I get to creating the server itself, it doesn't work: it isn't showing me the server name and the database name it needs. I tried typing them in myself, but it still doesn't work.
When I open either list, nothing shows up, and I understand they need to be populated for this to work.
You could install LocalDB through the Visual Studio Installer, as part of the Data Storage and Processing workload, the ASP.NET and web development workload, or as an individual component.
After you install it, you can connect to it directly.
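For example, once LocalDB is installed, its default shared instance is named (localdb)\MSSQLLocalDB and you can connect to it like any other SQL Server instance. A minimal sketch (the database name here is a made-up placeholder):

using System;
using System.Data.SqlClient;

class LocalDbCheck
{
    static void Main()
    {
        // Default LocalDB instance; "MyListsDb" is a hypothetical database name.
        const string connStr =
            @"Server=(localdb)\MSSQLLocalDB;Database=MyListsDb;Integrated Security=True";

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            Console.WriteLine("Connected to: " + conn.DataSource);
        }
    }
}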
Here is the full documentation about LocalDB.
You can also refer to the SSDT documentation's database project walkthroughs to learn the basic database operations.
Context:
I have a (legacy) ASP.NET app using a class library whose data access calls stored procedures through a dal.dbml and its generated code.
I am using VS 2015 Enterprise update 1, have locally installed Microsoft SQL Server 2014 - 12.0.4213.0 (X64) (Build 10586) and SSMS 2014.
Question
When running the web application, the app inserts rows into a table on which I recently implemented an insert trigger. How can I debug the actual trigger execution "triggered" by the web application?
Note: I do know how to interactively debug SPs in SSMS. That is not what I want here, because it is nearly impossible to reproduce and parameterize what the web app does. So I would like to debug via the web app.
Is this possible at all? (It seems like a fairly common need, though I understand it requires quite complex debugging infrastructure and services.)
What you do is run Profiler to capture the typical values being sent when the problem occurs.
Then the easiest way I know of to troubleshoot something like this is to take the code out of the trigger and put it in a script that uses a #inserted temp table (and/or #deleted if need be) in place of the pseudo-tables the trigger sees.
Then insert the data the stored proc would have inserted into the table into your #inserted temp table. Now you can work through the trigger logic step by step and even look at the results of a SELECT that is used in an INSERT.
The most common error I have found in triggers is that they are written to handle only one record in inserted rather than multiple records. So check how many records your proc would actually supply for the values you captured; a multi-row insert hitting single-row trigger logic is a classic cause. Or you may be missing a required field; in that case you may need to adjust the proc to supply the value or assign a default.
I have an ASP.NET website that imports data from an MS Access database on a shared network location into SQL Server. Previous developers implemented an ASP.NET timer to update a progress bar in the UI, and the actual import runs on a separate thread.
When I launch the website from Visual Studio, the process works fine for files in both local and network locations. But when I host the website on IIS, it works only for database files on a local drive; for any database file on a network share, the first SELECT query on the table never returns control, the system just stops there, nothing happens, and the UI progress keeps showing the same message.
I have granted the required permissions on the folder to the user account I used for the application pool.
What could be the reason?
I found the solution: that SELECT was fetching the record count with COUNT(*) instead of counting a specific column. The table has many columns, so it was taking a long time. I put a column name in place of * and it works fine now.
Still, the amount of time the original query took is interesting to note.
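For reference, a minimal sketch of the kind of change described, assuming the count runs against the Access file over OLE DB; the path, table, and column names are made up (the original query used COUNT(*) in place of the column):

using System;
using System.Data.OleDb;

class RecordCounter
{
    // Hypothetical share path, table, and column; the point is counting a single
    // column rather than using COUNT(*), as described above.
    public static int CountRows()
    {
        const string connStr =
            @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\\server\share\Import.mdb";

        using (var conn = new OleDbConnection(connStr))
        using (var cmd = new OleDbCommand("SELECT COUNT(ImportId) FROM ImportRecords", conn))
        {
            conn.Open();
            return Convert.ToInt32(cmd.ExecuteScalar());
        }
    }
}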
Our team has hundreds of integration tests that hit a database and verify results. I've got two base classes for all the integration tests, one for retrieve-only tests and one for create/update/delete tests. The retrieve-only base class regenerates the database during the TestFixtureSetup so it only executes once per test class. The CUD base class regenerates the database before each test. Each repository class has its own corresponding test class.
As you can imagine, this whole thing takes quite some time (approaching 7-8 minutes to run and growing quickly). Having this run as part of our CI (CruiseControl.Net) is not a problem, but running locally takes a long time and really prohibits running them before committing code.
My question is are there any best practices to help speed up the execution of these types of integration tests?
I'm unable to execute them in-memory (a la SQLite) because we use some database-specific functionality (computed columns, etc.) that isn't supported in SQLite.
Also, the whole team has to be able to execute them, so running them on a local instance of SQL Server Express or something could be error prone unless the connection strings are all the same for those instances.
How are you accomplishing this in your shop and what works well?
Thanks!
Keep your fast (unit) and slow (integration) tests separate, so that you can run them separately. Use whatever method for grouping/categorizing the tests is provided by your testing framework. If the testing framework does not support grouping the tests, move the integration tests into a separate module that has only integration tests.
The fast tests should take only a few seconds to run in total and should have high code coverage. These kinds of tests allow the developers to refactor ruthlessly, because they can make a small change, run all the tests, and be very confident that the change did not break anything.
The slow tests can take many minutes to run and they will make sure that the individual components work together right. When the developers do changes that might possibly break something which is tested by the integration tests but not the unit tests, they should run those integration tests before committing. Otherwise, the slow tests are run by the CI server.
In NUnit you can decorate your test classes (or methods) with a category attribute, e.g.:
[Category("Integration")]
public class SomeTestFixture{
...
}
[Category("Unit")]
public class SomeOtherTestFixture{
...
}
You can then stipulate in the build process on the server that all categories get run, and just require your developers to run a subset of the available test categories (the NUnit console runner can include or exclude categories when it runs). Which categories they are required to run depends on things you understand better than I do. But the gist is that they are able to test at the unit level and the server handles the integration tests.
I'm a java developer but have dealt with a similar problem. I found that running a local database instance works well because of the speed (no data to send over the network) and because this way you don't have contention on your integration test database.
The general approach we use to solving this problem is to set up the build scripts to read the database connection strings from a configuration file, and then set up one file per environment. For example, one file for WORKSTATION, another for CI. Then you set up the build scripts to read the config file based on the specified environment. So builds running on a developer workstation run using the WORKSTATION configuration, and builds running in the CI environment use the CI settings.
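A rough sketch of how test code might pick up the right connection string, assuming the build script sets an environment variable (TEST_ENVIRONMENT is a made-up name here) and the config file defines one named connection string per environment:

using System;
using System.Configuration;

static class TestDatabase
{
    // "TEST_ENVIRONMENT", "WORKSTATION" and "CI" follow the naming used above;
    // the variable itself is an assumption about how the build scripts are wired up.
    public static string ConnectionString
    {
        get
        {
            string env = Environment.GetEnvironmentVariable("TEST_ENVIRONMENT") ?? "WORKSTATION";
            return ConfigurationManager.ConnectionStrings[env].ConnectionString;
        }
    }
}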
It also helps tremendously if the entire database schema can be created from a single script, so each developer can quickly set up a local database for testing. You can even extend this concept to the next level and add the database setup script to the build process, so the entire database setup can be scripted to keep up with changes in the database schema.
We have an SQL Server Express instance with the same DB definition running for every dev machine as part of the dev environment. With Windows authentication the connection strings are stable - no username/password in the string.
What we would really like to do, but haven't yet, is see if we can get our system to run on SQL Server Compact Edition, which is like SQLite with SQL Server's engine. Then we could run them in-memory, and possibly in parallel as well (with multiple processes).
Have you done any measurements (using timers or similar) to determine where the tests spend most of their time?
If you already know that the database recreation is why they're time consuming a different approach would be to regenerate the database once and use transactions to preserve the state between tests. Each CUD-type test starts a transaction in setup and performs a rollback in teardown. This can significantly reduce the time spent on database setup for each test since a transaction rollback is cheaper than a full database recreation.
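A minimal sketch of that pattern with NUnit and TransactionScope (the class and method names are illustrative):

using System.Transactions;
using NUnit.Framework;

[TestFixture]
[Category("Integration")]
public abstract class TransactionalTestBase
{
    private TransactionScope _scope;

    [SetUp]
    public void BeginTransaction()
    {
        // Everything the test does against the database happens inside this scope.
        _scope = new TransactionScope();
    }

    [TearDown]
    public void RollbackTransaction()
    {
        // Disposing without calling Complete() rolls the work back, leaving the
        // database in the state it had before the test ran.
        _scope.Dispose();
    }
}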
The site I'm working on is running Windows Server 2003 and SQL Server 8 (2000?), and ASP.NET 3.5.
I need to have some sort of script or application run to import data from an FTP'd text file, into the database. There is already a site running on the machine, that uses the current database. Can I use a scheduled task to reliably kick off some sort of .aspx page that will import the data? Or is there a better approach?
What about making sure that no one else can access the page that runs the import? I don't want random users running the import!
Thanks in advance!
P.S. Some processing needs to occur on the data before it's inserted (lookups, conditionals, etc.), so the plain DB tools aren't robust enough (I think). I hate DTS, and SSIS is not available in this version, I think.
If you want a C# app to handle your import, I would suggest a Windows application (.exe) without a form (better than a console app because it doesn't pop up any UI whenever it runs). Have a scheduled task run it every so often (e.g. every minute).
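A rough sketch of what such an import app could look like; the file path, parsing, destination table, and connection string are placeholders, and the lookups/conditionals you mention would go where indicated:

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class ImportJob
{
    static void Main()
    {
        // Placeholder paths and connection string.
        const string importFile = @"C:\ftp\incoming\data.txt";
        const string connStr = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True";

        var table = new DataTable();
        table.Columns.Add("Code", typeof(string));
        table.Columns.Add("Amount", typeof(decimal));

        foreach (string line in File.ReadAllLines(importFile))
        {
            string[] parts = line.Split('\t');
            // ... lookups, conditionals and other processing go here ...
            table.Rows.Add(parts[0], decimal.Parse(parts[1]));
        }

        using (var conn = new SqlConnection(connStr))
        using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.ImportedData" })
        {
            conn.Open();
            bulk.WriteToServer(table);
        }
    }
}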
Why would you use ASP.NET? Depending on the complexity of the job, you could either load the data directly into the database (bulk load) or use DTS (SQL Server 2000) or SSIS (SQL Server 2005/2008) if more complex processing is needed.
DTS and stored procedures in a job.
BCP and stored procedures in a job.
You say you need to do a lot of lookups and conversions? SQL is good at that, and good at doing it fast. It can seem a little intimidating at first, but it's not hard.
Run a BULK INSERT or bcp to import the data instead; see http://msdn.microsoft.com/en-us/library/aa173839(SQL.80).aspx
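If you go that route, the BULK INSERT can also be kicked off from code into a staging table. Note that the file path is resolved on the SQL Server machine, not the client; the path, table, and delimiters below are placeholders:

using System.Data.SqlClient;

class BulkLoader
{
    public static void Load()
    {
        const string connStr = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True";
        // The path must be reachable from the SQL Server machine itself.
        const string sql =
            @"BULK INSERT dbo.ImportStaging
              FROM '\\server\ftp\data.txt'
              WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}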
I'll echo other people here - you don't want to have a scheduled task hit a web page. SQL Server provides some good data import options, or you could just write a simple windows program and run it as a scheduled task.
Another option would be to write a windows service that watches your FTP directory and does the import.
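A minimal sketch of that watcher idea; the directory path is a placeholder, and in a real Windows service this setup would live in OnStart:

using System;
using System.IO;

class FtpWatcher
{
    static void Main()
    {
        // Placeholder path and filter.
        var watcher = new FileSystemWatcher(@"C:\ftp\incoming", "*.txt");
        watcher.Created += (sender, e) =>
        {
            Console.WriteLine("New file: " + e.FullPath);
            // ... run the import against e.FullPath here ...
        };
        watcher.EnableRaisingEvents = true;

        Console.ReadLine();   // keep the process alive while watching
    }
}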
As others have said, probably a separate console application (triggered by a scheduled task) or a windows service would be the best option for this scenario.
On the other hand, if you already have all the required functionality available in the web app running on the server, then you could probably set up a scheduled task, that starts a script (VBscript, JScript), which in turn calls a page of the web app.
To have some sort of security (e.g. to prevent just any user from calling that page), you could add some code to the page that checks whether it was called via http://localhost. This would at least prevent the page from being called from a remote client.
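A sketch of such a check in the page itself; the page class name is made up, and in ASP.NET 3.5 you can test the request URL for loopback:

using System;
using System.Web.UI;

public class RunImportPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Only allow the import to be triggered from the local machine.
        if (!Request.Url.IsLoopback)
        {
            Response.StatusCode = 403;
            Response.End();
        }

        // ... kick off the import here ...
    }
}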