We have reports that our client runs multiple times daily, and some of these reports hit very important tables, which I'm not a fan of.
We currently have a small reporting table which is populated when a new user signs up. The user signs up and the aspnet_membership table is populated, then another sproc is run to populate the reporting table.
I've noticed we get some deadlocks during the registration process and I'm wondering if this is the cause, though I'm not sure since it's just doing Inserts which I don't think would cause deadlocks.
Anyway, as far as creating a reporting table, would it be better to just add a trigger onto the aspnet_membership table and when a new record is inserted we insert into our reporting table, or is having two sprocs run upon initial registration fine too?
I thought about an overnight process but the data needs to be real time.
Thanks in advance!
Could the reports at least be near real time? Could you have a SQL job that runs every 5 minutes perhaps that populates the reporting table?
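If a SQL Agent job isn't convenient, the same near-real-time effect can come from the application side. A minimal sketch, assuming a stored procedure named usp_PopulateReportingTable does the copy (the procedure name and connection handling are placeholders, not your actual code):

using System;
using System.Data;
using System.Data.SqlClient;
using System.Threading;

public class ReportingRefresher
{
    private readonly string _connectionString;
    private readonly Timer _timer;

    public ReportingRefresher(string connectionString)
    {
        _connectionString = connectionString;
        // Refresh immediately, then every 5 minutes.
        _timer = new Timer(Refresh, null, TimeSpan.Zero, TimeSpan.FromMinutes(5));
    }

    private void Refresh(object state)
    {
        using (var cnn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand("usp_PopulateReportingTable", cnn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cnn.Open();
            cmd.ExecuteNonQuery();   // copies any new signups into the reporting table
        }
    }
}

Either way, the registration path no longer has to touch the reporting table itself, which may also remove one potential source of the deadlocks you're seeing during registration.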
I'm relatively new to .NET and the controls it contains. I realize that this question is really open to opinion, but would appreciate the input of those of you who've had to support .NET pages where your database record sets were in the millions of records.
I have an Oracle table that will grow to several hundred thousand rows, and likely to over 5-10 million over the next few years. The current system this is replacing holds 6 million records. Not massive, but enough to impact page performance. We want to be able to display the data via an ASPX page with typical Next Page/Previous Page navigation. The user would have the ability to search/filter the data to limit their record set, but could still potentially have two to three pages (100-300 records) of results.
It is my understanding that the ListView object in .NET retrieves all of the records in your data source's SELECT statement, and that the filtering/paging controls then manipulate the results on the .NET server instead of at the database (until you issue a new DataBind()). This seems to be a potential performance issue with larger record sets, and we are already seeing it with just a few thousand records: the filter controls don't appear to filter until the entire underlying table's records are returned to .NET. If this is an incorrect assumption or understanding, please correct me.
Given the potential growth of our system, we're concerned about performance down the road. We're considering that we might be better off using a SELECT ... FROM ... where rownum>1 and rownum<=50 to work through our data pagination in our databind forcing the pagination to happen on the database side and executing on every page change, so that we're only working with a few hundred records at a time on the .NET side. Is this a false assumption regarding performance? Is there a performance issue with .NET performing the paging on large record sets?
My final question is if we're better off doing our pagination in the database rather than .NET, would we be better off storing our initial search results in a session specific temporary table and then using that as the basis of our paginated data rather than running the above query repeatedly against the master table as we move through the records?
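To make the intent concrete, here is a rough sketch of the database-side paging we have in mind, written against the generic ADO.NET base classes (the table and column names are placeholders). Note that Oracle assigns ROWNUM as rows are returned, so a flat rownum > 1 AND rownum <= 50 predicate actually returns no rows; the query has to be nested:

using System.Data;
using System.Data.Common;

// Fetch one page of results (rows firstRow+1 .. lastRow) from Oracle.
// cnn is assumed to be open.
internal static DataTable FetchPage(DbConnection cnn, int firstRow, int lastRow)
{
    using (DbCommand cmd = cnn.CreateCommand())
    {
        // Inner query defines the ordered result set, the middle query caps it at
        // lastRow, and the outer query strips everything up to firstRow.
        cmd.CommandText =
            "SELECT * FROM (" +
            "  SELECT t.*, ROWNUM rn FROM (" +
            "    SELECT id, name, created_date FROM master_table ORDER BY created_date" +
            "  ) t WHERE ROWNUM <= :last_row" +
            ") WHERE rn > :first_row";

        DbParameter pLast = cmd.CreateParameter();
        pLast.ParameterName = "last_row";
        pLast.Value = lastRow;
        cmd.Parameters.Add(pLast);

        DbParameter pFirst = cmd.CreateParameter();
        pFirst.ParameterName = "first_row";
        pFirst.Value = firstRow;
        cmd.Parameters.Add(pFirst);

        DataTable page = new DataTable();
        using (DbDataReader reader = cmd.ExecuteReader())
        {
            page.Load(reader);   // only this page of rows crosses the wire to .NET
        }
        return page;
    }
}

The returned DataTable would be bound to the ListView on each page change, with Next/Previous simply adjusting firstRow/lastRow and re-running the query.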
Thank you in advance for everyone's input.
We have an Oracle SOA composite that is deployed on WebLogic 11g. There is a trigger in a MySQL database that kicks off the composite. When it runs for a new entry, the account name is not being populated, so I added an additional query for the account name. I have included a screenshot of the check I use to query the account name.
It appears the corresponding table is not getting updated as fast as the table that the trigger is on. I tried putting a wait in the composite and that didn't work. I also tried a wait with a while loop, which hung the composite. Does anyone have any suggestions on how to handle a situation like this?
Thanks,
Tom
This was actually an issue with the composite: the assignments were incorrect, so the query did not return any data.
I have 5,000 records. I calculate the salary of one user at a time and update that user's data in the database, so it is taking quite long to update all 5,000 records. I want to calculate all users' salaries first and then update the records in the database.
Is there any other way we can update the database in a single click?
It really depends on how you are managing your data access layer and what data you need for the calculation. Do you have all the data you need in just one table, or do you need to fetch data from other tables for each record?
One way is to retrieve each record, do the calculation in a transaction, and then store the result in the database. Along the way you can use an Ajax UI to inform the user about the progress of the calculation. Use SqlDataReader for fetching the data, as it is very optimized and has less overhead than DataSet and DataTable, and it lets you avoid several type casts. You can optimize further by taking advantage of TPL, or by making it configurable to fetch/update N records at a time. This approach works if you have the IDs of the records. You also need a field on each record to track the calculation, so that after a disconnection, a crash, or an iisreset you can resume the calculation instead of rerunning it from the start.
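As a rough illustration of that read-then-update flow (the Employees table, its columns, and the formula are made-up placeholders; substitute your own schema and calculation):

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

internal static void RecalculateSalaries(string connectionString)
{
    var salaries = new Dictionary<int, decimal>();

    using (var cnn = new SqlConnection(connectionString))
    {
        cnn.Open();

        // Pass 1: stream the inputs with SqlDataReader and do the calculation in memory.
        using (var cmd = new SqlCommand("SELECT UserId, BaseSalary, Bonus FROM Employees", cnn))
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                int userId = reader.GetInt32(0);
                decimal calculated = reader.GetDecimal(1) + reader.GetDecimal(2); // your real formula here
                salaries[userId] = calculated;
            }
        }

        // Pass 2: write every result back inside one transaction, reusing one parameterized command.
        using (SqlTransaction tx = cnn.BeginTransaction())
        using (var update = new SqlCommand(
            "UPDATE Employees SET Salary = @salary WHERE UserId = @userId", cnn, tx))
        {
            update.Parameters.Add("@salary", SqlDbType.Decimal);
            update.Parameters.Add("@userId", SqlDbType.Int);

            foreach (KeyValuePair<int, decimal> pair in salaries)
            {
                update.Parameters["@salary"].Value = pair.Value;
                update.Parameters["@userId"].Value = pair.Key;
                update.ExecuteNonQuery();
            }
            tx.Commit();
        }
    }
}

Splitting the work into chunks of N records (and recording progress per record, as mentioned above) follows the same shape.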
My team is thinking about developing a real time application (a bunch of charts, gauges etc) reading from the database. At the backend we have a high volume Teradata database. We expect some other applications to be constantly feeding in data into this database.
Now we are wondering about how to feed in the changes from the database to the application. Polling from the application would not be a viable option in our case.
Are there any tools that are available within Teradata that would help us achieve this?
Any directions on this would be greatly appreciated
We faced a similar requirement. In our case, though, the client asked us to provide daily changes to a purchase orders table, which meant running a batch of scripts every day to capture the changes occurring to the table.
So we started collecting data every day and storing it in a sparse history format in another table. The process is simple: on the first day we store each purchase order detail record against that day's date in the history table. The next day we compare the new feed record against the history record and identify any change to that record. If any of the purchase order columns changed, we collect that record and keep it in a final reporting table which is shown to the client.
If you run the batch scripts only once a day and a record changes more than once in that day, this method cannot give you all of the changes. In that case you may need to run the batch scripts more than once per day, depending on your requirement.
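For what it's worth, the daily compare itself can usually be pushed down to the database as one set-based statement instead of row-by-row logic in the batch script. A rough sketch using generic ADO.NET (po_feed, po_history, po_changes, and the status column are placeholder names, not our real schema):

using System.Data.Common;

// Record every purchase order whose tracked columns differ between today's feed
// and the stored history snapshot. cnn is assumed to be open.
internal static void CaptureDailyChanges(DbConnection cnn)
{
    using (DbCommand cmd = cnn.CreateCommand())
    {
        cmd.CommandText =
            "INSERT INTO po_changes (po_number, old_status, new_status, change_date) " +
            "SELECT h.po_number, h.status, f.status, CURRENT_DATE " +
            "FROM po_feed f " +
            "JOIN po_history h ON h.po_number = f.po_number " +
            "WHERE f.status <> h.status";   // extend the predicate for each column you track

        cmd.ExecuteNonQuery();
    }
}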
Please let us know if you find any other solution. Hope this helps.
There is a change data capture tool from wisdomforce.
http://www.wisdomforce.com/resources/docs/databasesync/DatabaseSyncBestPracticesforTeradata.pdf
It would probably work in this case.
Are triggers with stored procedures an option?
CREATE TRIGGER dbname.triggername
AFTER INSERT ON db_name.tbl_name
REFERENCING stored_procedure
Theoretically speaking, you can write external stored procedures that call UDFs written in Java or C/C++ etc., which could push the row data to your application in near real time.
Hi guys,
I've developed a web application for an attendance management system using ASP.NET and SQL Server 2005. As you would know, attendance activities are carried out daily. I know inserting records one by one is a bad idea, so my questions are:
Is SqlBulkCopy the only option for me when using SQL Server if I want to insert 100 records on a click event, i.e. inserting attendance for a class which contains 100 students?
Can I insert the attendance of classes one by one?
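For reference, the kind of SqlBulkCopy call I have in mind would look roughly like this (the Attendance table and its columns are just placeholders for our schema):

using System;
using System.Data;
using System.Data.SqlClient;

internal static void SaveClassAttendance(string connectionString, int classId, int[] studentIds)
{
    // Build all 100 rows in memory first.
    var table = new DataTable();
    table.Columns.Add("ClassId", typeof(int));
    table.Columns.Add("StudentId", typeof(int));
    table.Columns.Add("AttendanceDate", typeof(DateTime));

    foreach (int studentId in studentIds)
    {
        table.Rows.Add(classId, studentId, DateTime.Today);
    }

    using (var cnn = new SqlConnection(connectionString))
    {
        cnn.Open();
        using (var bulk = new SqlBulkCopy(cnn))
        {
            bulk.DestinationTableName = "Attendance";
            // Map by name so an identity column on the table doesn't throw the ordinals off.
            bulk.ColumnMappings.Add("ClassId", "ClassId");
            bulk.ColumnMappings.Add("StudentId", "StudentId");
            bulk.ColumnMappings.Add("AttendanceDate", "AttendanceDate");
            bulk.WriteToServer(table);   // one round trip for the whole class
        }
    }
}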
Unless you have a particularly huge number of attendance records you're adding each day, the best way to do it is with insert statements (I don't know why exactly you've got it into your head that this is a bad idea; our databases frequently handle tens of millions of rows being added throughout the day).
If your attendance records are more than that, you're on a winner, getting that many people to attend whatever functions or courses you're running :-)
Bulk copies and imports are generally meant for transferring sizeable quantities of data, and I mean sizeable as in the entire contents of a database being shipped to a disaster recovery site (and other things like that). I've never seen it used in the wild as a way to get small amounts of data into a database.
Update 1:
I'm guessing based on the comments that you're actually entering the attendance records one by one into your web app and 1,500 is taking too long.
If that's the case, it's not the database slowing you down, nor the web app. It's how fast you can type.
The solution to that problem (if indeed it is the problem) is to provide a bulk import functionality into your web application (or database directly if you wish but you're better off in my opinion having the application do all the work).
This is of course assuming that the data you're entering can be accessed electronically. If all you're getting is pieces of paper with attendance details, you're probably out of luck (OCR solutions notwithstanding), although if you could get multiple people doing it concurrently, you may have some chance of getting it done in a timely manner. Hiring 1,500 people to do one each should knock it over in about five minutes :-)
You can add functionality to your web application to accept the file containing attendance details and process each entry, inserting a row into your database for each. This will be much faster than manually entering the information.
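A rough sketch of that import step, assuming a simple studentId,classId,date CSV layout (your actual file format and table will differ):

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

internal static void ImportAttendanceFile(string connectionString, string path)
{
    using (var cnn = new SqlConnection(connectionString))
    {
        cnn.Open();
        using (SqlTransaction tx = cnn.BeginTransaction())
        using (var cmd = new SqlCommand(
            "INSERT INTO Attendance (StudentId, ClassId, AttendanceDate) VALUES (@student, @class, @date)",
            cnn, tx))
        {
            cmd.Parameters.Add("@student", SqlDbType.Int);
            cmd.Parameters.Add("@class", SqlDbType.Int);
            cmd.Parameters.Add("@date", SqlDbType.DateTime);

            foreach (string line in File.ReadAllLines(path))
            {
                string[] parts = line.Split(',');
                cmd.Parameters["@student"].Value = int.Parse(parts[0]);
                cmd.Parameters["@class"].Value = int.Parse(parts[1]);
                cmd.Parameters["@date"].Value = DateTime.Parse(parts[2]);
                cmd.ExecuteNonQuery();   // one row per line in the file
            }
            tx.Commit();
        }
    }
}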
Update 2:
Based on your latest information that it's taking too long to process the data after starting it from the web application, I'm not sure how much data you have, but 100 records should take basically no time at all.
Where the bottleneck is I can't say, but you should be investigating that.
I know in the past we've had long-running operations from a web UI where we didn't want to hold up the user. There are numerous solutions for that, two of which we implemented:
take the operation off-line (i.e., run it in the background on the server), giving the user an ID to check on the status from another page (a rough sketch of this follows below);
the same thing, but notify the user by email once it's finished.
This allowed them to continue their work asynchronously.
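A bare-bones version of the first option, with an in-memory status map keyed by a job ID that a status page can poll (a sketch only; a real application would persist the status somewhere that survives an app pool recycle):

using System;
using System.Collections.Generic;
using System.Threading;

public static class BackgroundJobs
{
    private static readonly Dictionary<Guid, string> Status = new Dictionary<Guid, string>();
    private static readonly object Sync = new object();

    // Kick the long-running work off the request thread and hand back an ID.
    public static Guid Start(Action work)
    {
        Guid jobId = Guid.NewGuid();
        SetStatus(jobId, "Running");

        ThreadPool.QueueUserWorkItem(delegate
        {
            try
            {
                work();                          // e.g. the attendance processing
                SetStatus(jobId, "Done");
            }
            catch (Exception ex)
            {
                SetStatus(jobId, "Failed: " + ex.Message);
            }
        });

        return jobId;
    }

    // The status page calls this with the ID the user was given.
    public static string GetStatus(Guid jobId)
    {
        lock (Sync)
        {
            string status;
            return Status.TryGetValue(jobId, out status) ? status : "Unknown job";
        }
    }

    private static void SetStatus(Guid jobId, string status)
    {
        lock (Sync) { Status[jobId] = status; }
    }
}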
Ah, with your update I believe the problem is that you need to add a bunch of records after some click, but it takes too long.
I suggest one thing that won't help you immediately:
Reconsider your design slightly, as this doesn't seem particularly great (from a DB point of view). But that's just a general guess; I could be wrong.
The more helpful suggestion is:
Do this offline (via a windows service, or similar)
If it's taking too long, you want to do it asynchronously and then inform the user later that the operation is complete. They probably don't even need to be around; you just don't let them use whatever functions need that data until it's completed. Hope that idea makes sense.
The fastest general way is to use ExecuteNonQuery.
internal static void FastInsertMany(DbConnection cnn)
{
    // Wrap all the inserts in one transaction so they are committed together
    // (much faster than autocommitting each row).
    using (DbTransaction dbTrans = cnn.BeginTransaction())
    {
        using (DbCommand cmd = cnn.CreateCommand())
        {
            cmd.Transaction = dbTrans;   // enlist the command in the transaction
            cmd.CommandText = "INSERT INTO TestCase(MyValue) VALUES(@value)";

            // Create the parameter once and reuse it for every row.
            DbParameter Field1 = cmd.CreateParameter();
            Field1.ParameterName = "@value";
            cmd.Parameters.Add(Field1);

            for (int n = 0; n < 100000; n++)
            {
                Field1.Value = n + 100000;
                cmd.ExecuteNonQuery();
            }
        }
        dbTrans.Commit();
    }
}
Even on a slow computer this should take far less than a second for 1500 inserts.