We are running a reporting web application that lets the user select a few fields, then generates a Crystal Report based on the fields selected. The SQL generated for the most complex report returns its data in under 5 seconds, yet the report itself takes an average of 3 minutes to run, sometimes longer, causing a timeout. We are using VS2010. The reports are essentially out of the box, with no real manipulation or computation being done; they just display the data in a nice format. Is there anything we can try to speed things up: pre-loading a dummy report to load the DLLs, some hack to make Crystal run faster, anything?
EDIT: Code added to show the data binding.
protected void Page_Load(object sender, EventArgs e)
{
    if (!Page.IsPostBack)
    {
        // Load the report definition from disk
        string strFile = Server.MapPath(@"AwardStatus.rpt");
        CrystalReportSource1.Report.FileName = strFile;

        // Bind the report to the data table built below
        DataTable main = Main();
        CrystalReportSource1.ReportDocument.SetDataSource(main);
        CrystalReportViewer1.HasCrystalLogo = false;

        // Stream the report to the browser as a PDF
        CrystalReportSource1.ReportDocument.ExportToHttpResponse(
            CrystalDecisions.Shared.ExportFormatType.PortableDocFormat, Response, false, "pmperformance");
    }
}
private DataTable Main()
{
    Guid guidOffice = Office;
    CMS.Model.ReportsTableAdapters.ViewACTableAdapter rptAdapter = new CMS.Model.ReportsTableAdapters.ViewACTableAdapter();
    Reports.ViewACDataTable main = new Reports.ViewACDataTable();
    if (guidOffice == Guid.Empty)
    {
        // No specific office: import every row the user has permission to see
        IEnumerable<DataRow> data = rptAdapter.GetData().Where(d => UserPermissions.HasAccessToOrg(d.guidFromId, AuthenticatedUser.PersonID));
        foreach (var row in data)
        {
            main.ImportRow(row);
        }
    }
    else
    {
        // A specific, accessible office was requested
        main = rptAdapter.GetDataByOffice(guidOffice);
    }
    return main;
}
private Guid Office
{
    get
    {
        string strOffice = Request.QueryString["Office"];
        Guid guidOffice = BaseControl.ParseGuid(strOffice);

        // Treat any office the user cannot access as "no office selected"
        if (!UserPermissions.HasAccessToOrg(guidOffice, AuthenticatedUser.PersonID))
        {
            return Guid.Empty;
        }
        return guidOffice;
    }
}
protected void CrystalReportSource1_DataBinding(object sender, EventArgs e)
{
    //TODO
}
This may be a bit flippant, but possibly consider not using Crystal Reports... We had a fair bit of trouble with it recently (out-of-memory errors being one), and we've moved to other options and are quite happy...
Here's what I would do:
Put clocks in from the time you get the field choices from the user all the way to when you display the report, and see where your processing time goes (see the timing sketch after the list below).
When you look at the clocks, there can be various situations:
If Crystal Reports is taking time to fill the report, check how you're filling it. If you're linking the report fields directly to your data table, CR is probably spending the time looking up the data. I suggest creating a new table (t_rpt) with generic columns (Field1, Field2, ..., FieldN) and pointing your report template at that table. I don't know whether you're already doing this.
If looking up the data itself is what takes time, I suggest creating a view of your table. Even though it is a memory hog, it will make the lookup quick, and you can delete the view once you're done.
If it's none of the above, let us know what your clocks show.
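For the timing itself, here is a minimal Stopwatch sketch around the stages in the question's Page_Load (the stage boundaries are only examples; place them wherever your own processing steps fall):

using System.Diagnostics;

var sw = Stopwatch.StartNew();

DataTable main = Main();                                   // stage 1: data retrieval
long msData = sw.ElapsedMilliseconds;

CrystalReportSource1.ReportDocument.SetDataSource(main);   // stage 2: binding
long msBind = sw.ElapsedMilliseconds - msData;

CrystalReportSource1.ReportDocument.ExportToHttpResponse(  // stage 3: render/export
    CrystalDecisions.Shared.ExportFormatType.PortableDocFormat, Response, false, "pmperformance");
long msRender = sw.ElapsedMilliseconds - msData - msBind;

// Log however you normally do; Trace.WriteLine is just an example sink.
Trace.WriteLine(string.Format("data={0}ms bind={1}ms render={2}ms", msData, msBind, msRender));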
In terms of loading any large amount of data, you'll always want to use a stored procedure.
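For example, a plain ADO.NET call to a stored procedure that fills the DataTable handed to the report; the procedure name and parameter are hypothetical:

using System.Data;
using System.Data.SqlClient;

DataTable LoadReportData(string connectionString, Guid officeId)
{
    DataTable table = new DataTable();
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("dbo.usp_GetAwardedContracts", conn)) // hypothetical proc
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@OfficeId", officeId);
        using (SqlDataAdapter adapter = new SqlDataAdapter(cmd))
        {
            adapter.Fill(table); // the adapter opens and closes the connection itself
        }
    }
    return table;
}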
Outside of that, you WILL see a delay the first time a report runs while the Crystal DLLs load. Yes, you can preload them as you mentioned, and that will help some.
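One way to preload them is to load, and immediately discard, a dummy report when the application starts. A sketch for Global.asax.cs, assuming a small Warmup.rpt sits in the site root:

using System;
using CrystalDecisions.CrystalReports.Engine;

protected void Application_Start(object sender, EventArgs e)
{
    // Loading any report once forces the Crystal assemblies to load and JIT,
    // so the first real report doesn't pay that cost.
    using (ReportDocument doc = new ReportDocument())
    {
        doc.Load(Server.MapPath("~/Warmup.rpt")); // Warmup.rpt is a placeholder dummy report
    }
}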
I'm obtaining data from an external service and inserting it into an in-memory table (Table_movieTemp), which I use as a datasource on a form (Form_MovieSearch_ds):
[FormControlEventHandler(formControlStr(Form_MovieSearch, FormCommandButtonControl1), FormControlEventType::Clicked)]
public static void FormCommandButtonControl1_OnClicked(FormControl sender, FormControlEventArgs e)
{
    // Declarations added so the snippet compiles as a static handler
    FormRun formRun = sender.formRun() as FormRun;
    FormDataSource Form_MovieSearch_ds = formRun.dataSource();
    Table_movieTemp Table_movieTemp;
    RecordInsertList insertList = new RecordInsertList(tableNum(Table_movieTemp));

    System.Collections.IEnumerable data = ClassLibrary1.Program::CallRestService();
    var enumerator = data.getEnumerator();
    while (enumerator.moveNext())
    {
        MovieRentalService.TmdbMovie item = enumerator.get_current();
        Table_movieTemp.Description = item.Description;
        Table_movieTemp.ReleaseDate = today();
        Table_movieTemp.Title = item.Title;
        Table_movieTemp.Rating = item.Rating;
        Table_movieTemp.existsAlready = Table_Movie::exist(item.Title);
        insertList.add(Table_movieTemp);
    }

    ttsbegin;
    insertList.insertDatabase();
    ttscommit;

    // Purely to prove the inserts were successful
    while select Table_movieTemp
    {
        info(strFmt("Name: %1", Table_movieTemp.Title));
    }
}
I used the while select loop purely to prove the inserts were successful.
Afterwards I figured I could call executeQuery on the form, which has my temp table as its datasource:
Form_MovieSearch_ds.executeQuery();
This did not work, and when I searched Google I found a solution where I have to pass the temp table buffer so that I can link it using setTmpData. So I added the following call before calling executeQuery():
formRun.BindTable(Table_movieTemp);
Function on my form:
public void BindTable(Table_movieTemp _movieTempBuffer)
{
    _movieTempBuffer.setTmpData(_movieTempBuffer);
}
Now my code compiles and does not generate runtime errors either, but I still don't see any data. Could someone advise on what I am missing or doing wrong?
The use of in-memory tables in forms has been around for 25 years, and you will find several uses in the standard application.
From the CustVendAgingStatistics form:
void calcAgingStatistics(boolean _research)
{
    CustVendAgingStatistics custVendAgingStatistics = CustVendAgingStatistics::construct(linkedCustVendTable, graphData.loadDefName(), graphData.perInvoiceDate());
    custVendAgingStatistics.calcStatistic();
    tmpAccountSum.setTmpData(custVendAgingStatistics.tmpAccountsum());
    if (_research)
    {
        tmpAccountSum_ds.research();
    }
}
Another nice example is found here.
The method:
1. Insert the records in a separate method and return the local buffer.
2. In the calling method, call setTmpData with the return value.
3. Research the datasource.
In your code I see the use of RecordInsertList; do not use that on in-memory temporary tables, it makes no sense there.
Also, _movieTempBuffer.setTmpData(_movieTempBuffer) does not do anything useful, as it operates on itself.
It is also good style not to do a lot of work in onClicked and other event methods; call proper methods to do the hard work instead.
I have a SharePoint 2010 site for which I have created an ASP.NET menu control.
The menu's content is initially empty, and in Page_Init I load its content from static HTML files on the server:
protected void Page_Init(object sender, EventArgs e)
{
    // RootMenuPath points at the folder holding the menu's HTML fragments
    string MenuPath = ConfigurationManager.AppSettings["RootMenuPath"];
    Menu1.Items[0].ChildItems[0].Text = File.ReadAllText(MenuPath + "\\About.htm");
    //etc...
}
I realize this is a horrible way to do things: it hits the disk every single time a user loads a page.
How can I either:
a) Cache the code and asp.net menu item so that it stays in memory?
b) Use another method to ensure it isn't loaded from the disk?
Thanks
You can wrap the data load in a property and use at least the page Cache there:
readonly object cacheLock = new object();

string AboutHTM
{
    get
    {
        if (Cache.Get("page.about") == null)
        {
            lock (cacheLock)
            {
                // Double-check inside the lock so only one thread reads the file
                if (Cache.Get("page.about") == null)
                    Cache.Insert("page.about", File.ReadAllText(MenuPath + "\\About.htm"));
            }
        }
        return Cache["page.about"].ToString();
    }
}
You could indeed use the cache, or some variable that is initialized only once in Application_Start and reused later, but I am afraid you are doing some premature optimization here. You probably shouldn't bother unless you have identified this as a bottleneck for your application's performance. Reading files from disk is a fast operation, especially if the files are small.
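If you do go the Application_Start route, here is a minimal sketch; the static holder class is hypothetical:

// In a separate file: a static holder for the cached fragments.
public static class MenuContent
{
    public static string AboutHtml;
}

// In Global.asax.cs:
protected void Application_Start(object sender, EventArgs e)
{
    string menuPath = System.Configuration.ConfigurationManager.AppSettings["RootMenuPath"];
    // Read the file once at startup; every later page load is a memory read.
    MenuContent.AboutHtml = System.IO.File.ReadAllText(System.IO.Path.Combine(menuPath, "About.htm"));
}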
If possible, I would store the menu data in an XML file, and cache the XML file.
XmlDocument xDoc = new XmlDocument();
if (Cache.Get("MenuData") == null)
{
    xDoc.Load(Server.MapPath("/MenuData.xml"));
    // Cache under the same key we read back, invalidated when the file changes
    Cache.Insert("MenuData", xDoc, new CacheDependency(Server.MapPath("/MenuData.xml")));
}
else
{
    xDoc = (XmlDocument)HttpContext.Current.Cache.Get("MenuData");
}
Setting myGridView.DataSource = LinqDataSource works, but only for select. I have edit and delete columns, and when I try to use them I get errors about events not being caught. Specifically I've seen OnRowDeleting, but I'm sure there are others that need to be wired up.
myGridView.OnRowDeleting = ??
I can't seem to find anything on the LinqDataSource that looks like what I need :(
edit: here is some sample code illustrating what I'm doing.
protected virtual void OnRowDeleted(Object sender, GridViewDeletedEventArgs e)
{
    // Rows.Count == 1 here means the last data row was just deleted
    if (fieldGridView.Rows.Count == 1)
    {
        fieldGridView.DataSourceID = null;
        fieldGridView.DataSource = new[] { new FormField { Required = false } };
        fieldGridView.AutoGenerateDeleteButton = false;
        fieldGridView.AutoGenerateEditButton = false;
    }
}
protected void InsertButton_Click(object sender, CommandEventArgs e)
{
    // Pull data out of the footer row and insert it into the DB
    if (fieldGridView.DataSource == null || fieldGridView.DataSource.GetType() != LinqDataSource1.GetType())
    {
        fieldGridView.DataSource = LinqDataSource1;
        fieldGridView.AutoGenerateDeleteButton = true;
        fieldGridView.AutoGenerateEditButton = true;
    }
}
Also note that when I statically set the OnRowDeleting event in the markup, the error went away. However, the LINQ data source is not getting set back properly. The code updates it, but the change just isn't sticking for whatever reason: when I re-enabled the delete column, the data source was still the temporary one I assigned (which is just a list), so hitting delete blew up. Since I've now statically defined the callback, it does get called, but no delete happens, because the data source isn't being switched back to the LinqDataSource properly.
So in essence: when I dynamically set the data source to the list on the last delete, the change sticks, but when I dynamically set it back to the LinqDataSource on the first insert, the change does not stick. The strangest part is that other changes I make do stick; I'm enabling the autogenerated edit and delete columns, and that change is reflected immediately.
On your LinqDataSource, did you remember to explicitly set the enable flags for delete, update, and insert?
<asp:LinqDataSource ID="MyLinqDataSource" EnableDelete="true" EnableUpdate="true" EnableInsert="true" runat="server" ....
Ultimately I was not able to fix this specific issue, but I did find a workaround for my larger issue.
In essence what I've done is to create two gridviews with different data sources, created a custom delete button for each row, and then swap the grids out accordingly. Not exactly a great solution, but it suffices for what I'm doing.
As for why my original solution of swapping the datasources doesn't work, I don't know. Most of the projects I've been on have used non-MS tools for tabulated data so I'm not even sure where to look. However, if anyone has any good ideas as to why I've encountered this behavior, I'm all ears.
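One possible explanation, offered as an assumption rather than a confirmed diagnosis: a GridView's DataSource property is not persisted across postbacks, while DataSourceID is stored in view state. Switching back by ID may therefore stick where the object assignment does not:

// Re-attach the declarative data source by its ID; unlike the DataSource
// property, DataSourceID is kept in view state across postbacks.
fieldGridView.DataSource = null;
fieldGridView.DataSourceID = "LinqDataSource1";
fieldGridView.AutoGenerateDeleteButton = true;
fieldGridView.AutoGenerateEditButton = true;
fieldGridView.DataBind();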
Here is what I am trying to do. I have a TreeView server-side control (ASP.NET 2.0) and I need the user to be able to add nodes to it; after all the desired nodes are added, the data should be saved to the database.
Here are some things I would like to pay attention to:
1) I don't want to save the tree data each time a new node is added, but rather keep the data in session until the user decides to save the entire tree. The question here is: can I bind the tree to an ArrayList object and keep that object in session (rather than keeping the whole tree in session)? Then each time a node is added I would rebind the tree to the ArrayList rather than to the database.
2) I wish to minimize ViewState; any tips? What works best: compressing ViewState, or keeping it all on the server at all times?
Thanks!
Use TreeNodeCollection as your internal array to hold in either ViewState or Session. Here's a rough mock-up of an approach you can use; far from perfect, but it should set you on the right track.
TreeView tv = new TreeView();

// Button click event for 'Add Node' button
protected void AddNode(object sender, EventArgs e)
{
    if (SaveNodeToDb(txtNewNode.Text, txtNavUrl.Text))
    {
        // Store user input details for the new node in Session
        Nodes.Add(new TreeNode() { Text = txtNewNode.Text, NavigateUrl = txtNavUrl.Text });

        // Clear and re-add
        tv.Nodes.Clear();
        foreach (TreeNode n in Nodes)
            tv.Nodes.Add(n);
    }
}

public bool SaveNodeToDb(string name, string url)
{
    // DB save action here; return success/failure.
    return true;
}

public TreeNodeCollection Nodes
{
    get
    {
        // Create the collection on first use and keep it in Session,
        // so additions made through this property are not lost.
        if (Session["UserNodes"] == null)
            Session["UserNodes"] = new TreeNodeCollection();
        return (TreeNodeCollection)Session["UserNodes"];
    }
    set
    {
        Session["UserNodes"] = value;
    }
}
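On point 2 of the question: one way to keep page state on the server at all times (rather than compressing it) is to override the page's PageStatePersister with the built-in SessionPageStatePersister; a minimal sketch:

// In the page class: store view state and control state in Session
// instead of serializing them into the page output.
protected override PageStatePersister PageStatePersister
{
    get { return new SessionPageStatePersister(this); }
}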
I have a problem which is perfectly described here (http://www.bokebb.com/dev/english/1972/posts/197270504.shtml):
Scenario:
Windows smart client app and the CrystalReportViewer for windows.
Using ServerFileReports to access reports through a centralized and disconnected folder location.
When accessing a report that was designed against DB_DEV and attempting to change its logon information through the CrystalReportViewer to point at DB_UAT, it never seems to actually use the changed information.
It always goes against the DB_DEV info.
Any idea how to change the database connection and logon information for a ServerFileReport?
Here's the code:
FROM A PRESENTER:
// event that fires when the view's run report button is pressed
private void RunReport(object sender, EventArgs e)
{
    this.view.LoadReport(Report, ConnectionInfo);
}
protected override object Report
{
    get
    {
        ServerFileReport report = new ServerFileReport();
        report.ObjectType = EnumServerFileType.REPORT;
        report.ReportPath = @"\Report2.rpt";
        report.WebServiceUrl = "http://localhost/CrystalReportsWebServices2005/ServerFileReportService.asmx";
        return report;
    }
}
private ConnectionInfo ConnectionInfo
{
    get
    {
        ConnectionInfo info = new ConnectionInfo();
        info.ServerName = servername;
        info.DatabaseName = databasename;
        info.UserID = userid;
        info.Password = password;
        return info;
    }
}
ON THE VIEW WITH THE CRYSTAL REPORT VIEWER:
public void LoadReport(object report, ConnectionInfo connectionInfo)
{
    viewer.ReportSource = report;
    SetDBLogon(connectionInfo);
}

private void SetDBLogon(ConnectionInfo connectionInfo)
{
    foreach (TableLogOnInfo logOnInfo in viewer.LogOnInfo)
    {
        logOnInfo.ConnectionInfo = connectionInfo;
    }
}
Does anyone know how to solve the problem?
I know this isn't the programmatic answer you're looking for, but:
One thing that helps with this sort of problem is not creating your reports connected directly to the database. Instead, first create a "Data Dictionary" for Crystal Reports (this is done in the Report Designer), then link all of your reports to that dictionary, which maps the fields to the proper databases.
Done this way, you only have one place to change the database schema/connection info for all reports.
Also, in the Report Designer, set the report not to cache its results (sorry, I don't remember the exact option); reports can be saved either with their initial results included or not.
Don't you have to loop over all of the report's "databaseTable" objects to redirect the corresponding connections? You'll find my VB version of the 'database switch' problem here...
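For reference, here is a C# sketch of that loop against a regular ReportDocument; whether a ServerFileReport exposes the same Database.Tables collection is an assumption to verify:

using CrystalDecisions.CrystalReports.Engine;
using CrystalDecisions.Shared;

// Push the new connection into every table of the report before viewing it.
static void ApplyConnection(ReportDocument report, ConnectionInfo connectionInfo)
{
    foreach (Table table in report.Database.Tables)
    {
        TableLogOnInfo logOnInfo = table.LogOnInfo;
        logOnInfo.ConnectionInfo = connectionInfo;
        table.ApplyLogOnInfo(logOnInfo);
    }
}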
In your CrystalReportViewer object you should set
AutoDataBind="true"