Understanding the JIT; slow website - ASP.NET

First off, this question has been covered a few times before (I've done my research); I have been through the related questions listed on the side of the SO page, or at least as many as I could find.
When I publish my pre-compiled .NET web application, it is very slow to load the first time.
I've read up on this: it's the JIT, which I (sort of) understand.
The odd part is that after the home page loads (taking up to 20 seconds), many other pages then load very quickly.
It would appear they load quickly only because the shared resources have already been loaded (or because they share the same compiled DLLs); some pages still take a long time.
Does this indicate that the JIT has to compile different pages separately? If so, take a contact form as an example: the Thank You page has to be JIT-compiled on its first hit and is therefore slow, so the user may press the send button multiple times while waiting for the page to appear.
Once I have loaded all the pages that use different models or different shared HTML content, the site is as quick as expected. I assume this is a common problem?
Please note: I'm using .NET 4.0, and there is no database, XML file access, etc. The only I/O is that if an email fails to send, the error is written to a log.
So, assuming my understanding is correct, what is the approach to avoid having to manually go through the website and load every page?
If the above is a little too broad: can this be resolved via settings/configuration in Visual Studio (2012) or in web.config (excluding setting compilation debug="false")?

In this case, there were two problems.
1. As per rene's comments, review http://msdn.microsoft.com/en-us/library/ms972959.aspx. The helpful part was adding the following code to the global.asax file, which writes any unhandled exception (including its inner exceptions) to the Windows event log:
const string sourceName = ".NET Runtime";
const string serverName = ".";
const string logName = "Application";
const string uriFormat = "\r\n\r\nURI: {0}\r\n\r\n";
const string exceptionFormat = "{0}: \"{1}\"\r\n{2}\r\n\r\n";

void Application_Error(Object sender, EventArgs ea)
{
    StringBuilder message = new StringBuilder();

    if (Request != null)
    {
        message.AppendFormat(uriFormat, Request.Path);
    }

    if (Server != null)
    {
        // Walk the whole chain of inner exceptions.
        Exception e;
        for (e = Server.GetLastError(); e != null; e = e.InnerException)
        {
            message.AppendFormat(exceptionFormat,
                                 e.GetType().Name,
                                 e.Message,
                                 e.StackTrace);
        }
    }

    if (!EventLog.SourceExists(sourceName))
    {
        EventLog.CreateEventSource(sourceName, logName);
    }

    EventLog log = new EventLog(logName, serverName, sourceName);
    log.WriteEntry(message.ToString(), EventLogEntryType.Error);

    //Server.ClearError(); // uncomment this to cancel the error
}
2. The server was maxing out while sending the email! My code was fine, but Task Manager showed memory hitting 100%.
The solution was to monitor the errors surfaced by point 1 and fix them, and then to find out why the server was being throttled when sending an email.
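As for the original warm-up question: one way to avoid visiting every page by hand (a sketch, assuming IIS 8.0 or later, or IIS 7.5 with the Application Initialization module installed) is to have IIS request key pages itself whenever the application pool starts:

<system.webServer>
  <!-- Sketch: the page list is illustrative; add the URLs that are slow on first hit. -->
  <applicationInitialization doAppInitAfterRestart="true">
    <add initializationPage="/" />
    <add initializationPage="/Contact/ThankYou" />
  </applicationInitialization>
</system.webServer>

Combined with setting the application pool's startMode to AlwaysRunning, this pays the JIT cost before the first visitor arrives.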

Related

Faster way to achieve document conversion/preview task

I created a WinForms app to convert .docx to HTML using pandoc, with a WebBrowser control to display the HTML file. This application is much needed by my colleagues at the university for previewing .docx files, since we don't have access to MS Office any more.
I tested this on my PC and it works fine: on each item click in the listbox it loads the preview in the WebBrowser quickly. But I want to make it quicker still. Are there any recommendations to make it faster (I can provide the full code if needed)? The main listbox SelectedIndexChanged event handler follows.
Also, tell me which one is faster: setting wb.DocumentText to blank, or navigating to the about:blank page.
private void lbFiles_SelectedIndexChanged(object sender, EventArgs e)
{
    try
    {
        wb.DocumentText = "";

        // Two string lists
        SelectedFile = AllFiles[lbFiles.SelectedIndex];
        NameOnly = AllNamesOnly[lbFiles.SelectedIndex];

        if (NameOnly.EndsWith(".txt") || NameOnly.EndsWith(".docx"))
        {
            #region MediaFolder
            if (Directory.Exists("MF")) Directory.Delete("MF", true);
            Directory.CreateDirectory("MF");
            #endregion

            string cmd = "pandoc --extract-media ./MF \"" + SelectedFile + "\" -o " + "output.html";
            File.WriteAllText("BatchFile.bat", cmd);
            StartHidden("BatchFile.bat"); // Process object with ProcessWindowStyle.Hidden and a 3-second exit wait
            wb.Navigate(Environment.CurrentDirectory + "\\" + "output.html");
        }
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace ("throw ex;" would truncate it)
    }
}
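One likely speed-up (a sketch, assuming pandoc is on the PATH; SelectedFile is the field used above): run pandoc directly through Process instead of writing a batch file, so control returns the moment the conversion finishes rather than after a fixed three-second wait.

// Run pandoc directly; WaitForExit returns as soon as the conversion completes.
var psi = new System.Diagnostics.ProcessStartInfo
{
    FileName = "pandoc",
    Arguments = "--extract-media ./MF \"" + SelectedFile + "\" -o output.html",
    UseShellExecute = false,
    CreateNoWindow = true
};
using (var p = System.Diagnostics.Process.Start(psi))
{
    p.WaitForExit();
}
wb.Navigate(Environment.CurrentDirectory + "\\output.html");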
I tried hard with various solutions.
Most of them were not free, so they were not usable. Using pandoc, as in the OP, was not feasible, since it doesn't support the variety of fonts and formats.
After going through many possible alternatives (GemBox, Spire.Doc, etc.), I finally moved to the Syncfusion community edition. It is free, its libraries can convert all the major word-processor formats, its output matched that of the non-free solutions, and it works faster than pandoc.
Another thing to note: I also switched from the WebBrowser control to CefSharp, as it is faster and lighter than WebBrowser and works better for PDF previews in the browser (you can use the zoom level and page number as part of the URL).

System.IO.Directory::GetFiles() Polling from AX 2009, Only Seeing New Files Every 10s

I wrote code in AX 2009 to poll a directory on a network drive every second, waiting for a response file from another system. I noticed that in a file explorer window I could see the file appear, yet my code did not see and process the file for several seconds - up to 9 seconds (and 9 polls) after the file appeared!
The AX code calls System.IO.Directory::GetFiles() using ClrInterop:
interopPerm = new InteropPermission(InteropKind::ClrInterop);
interopPerm.assert();
files = System.IO.Directory::GetFiles(#POLLDIR,'*.csv');
// etc...
CodeAccessPermission::revertAssert();
After much experimentation, it emerges that the first call to ::GetFiles() in my program's lifetime starts a notional "ticking clock" with a period of 10 seconds. Only calls that land on a 10-second tick find any new files, though calls in between still report files that were found on an earlier tick since the first call to ::GetFiles().
If the file is not there when I start the program, then the calls to ::GetFiles() 1 second after the first call, 2 seconds after, and so on up to 9 seconds after, simply do not see the file, even though it may have been sitting there since half a second after the first call!
Then, reliably, and repeatably, the call 10s after the first call, will find the file. Then no calls from 11s to 19s will see any new file that might have appeared, yet the call 20s after the first call, will reliably see any new files. And so on, every 10 seconds.
Further investigation revealed that if the polled directory is on the AX AOS machine, this does not happen, and the file is found immediately, as one would expect, on the call after the file appears in the directory.
But this figure of 10s is reliable and repeatable, no matter what network drive I poll, no matter what server it's on.
Our network certainly doesn't have 10s of latency to see files; as I said, a file explorer window on the polled directory sees the file immediately.
What is going on?
Sounds like your issue is due to SMB caching - from this technet page:
Name, type, and ID: Directory Cache, [DWORD] DirectoryCacheLifetime
Registry key the cache setting is controlled by: HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Lanmanworkstation\Parameters
This is a cache of recent directory enumerations performed by the client. Subsequent enumeration requests made by client applications, as well as metadata queries for files in the directory, can be satisfied from the cache. The client also uses the directory cache to determine the presence or absence of a file in the directory and uses that information to prevent clients from repeatedly attempting to open files which are known not to exist on the server. This cache is likely to affect distributed applications running on multiple computers accessing a set of files on a server, where the applications use an out-of-band mechanism to signal each other about modification/addition/deletion of files on the server.
In short, try setting the registry value HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Lanmanworkstation\Parameters\DirectoryCacheLifetime to 0.
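For example, from an elevated command prompt (a restart of the Workstation service, or a reboot, may be needed for the change to take effect):

reg add HKLM\System\CurrentControlSet\Services\Lanmanworkstation\Parameters /v DirectoryCacheLifetime /t REG_DWORD /d 0 /f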
Thanks to @Jan B. Kjeldsen, I have been able to solve my problem using FileSystemWatcher. Here is my implementation in X++:
class SelTestThreadDirPolling
{
}

public server static Container SetStaticFileWatcher(str _dirPath, str _filenamePattern, int _timeoutMs)
{
    InteropPermission              interopPerm;
    System.IO.FileSystemWatcher    fw;
    System.IO.WatcherChangeTypes   watcherChangeType;
    System.IO.WaitForChangedResult res;
    Container                      cont;
    str                            fileName;
    str                            oldFileName;
    str                            changeType;
    ;
    interopPerm = new InteropPermission(InteropKind::ClrInterop);
    interopPerm.assert();

    fw = new System.IO.FileSystemWatcher();
    fw.set_Path(_dirPath);
    fw.set_IncludeSubdirectories(false);
    fw.set_Filter(_filenamePattern);

    watcherChangeType = ClrInterop::parseClrEnum('System.IO.WatcherChangeTypes', 'Created');
    res = fw.WaitForChanged(watcherChangeType, _timeoutMs);

    if (res.get_TimedOut())
        return conNull();

    fileName = res.get_Name();
    // ChangeType can be: Created, Deleted, Renamed or Changed
    changeType = System.Enum::GetName(watcherChangeType.GetType(), res.get_ChangeType());
    fw.Dispose();
    CodeAccessPermission::revertAssert();

    if (changeType == 'Renamed')
        oldFileName = res.get_OldName();

    cont += fileName;
    cont += changeType;
    cont += oldFileName;
    return cont;
}

void waitFileSystemWatcher(str _dirPath, str _filenamePattern, int _timeoutMs)
{
    container cResult;
    str       filename, changeType, oldFilename;
    ;
    cResult = SelTestThreadDirPolling::SetStaticFileWatcher(_dirPath, _filenamePattern, _timeoutMs);
    if (cResult)
    {
        [filename, changeType, oldFilename] = cResult;
        info(strfmt("filename=%1, changeType=%2, oldFilename=%3", filename, changeType, oldFilename));
    }
    else
    {
        info("TIMED OUT");
    }
}

void run()
{
    ;
    this.waitFileSystemWatcher(#'\\myserver\mydir', 'filepattern*.csv', 10000);
}
I should acknowledge the following for forming the basis of my X++ implementation:
https://blogs.msdn.microsoft.com/floditt/2008/09/01/how-to-implement-filesystemwatcher-with-x/
I would guess DAXaholic's answer is correct, but you could also try other approaches such as EnumerateFiles.
In your case, though, I would rather wait for the files than poll for them.
With FileSystemWatcher there is minimal delay between file creation and your process waking up. It is trickier to use, but avoiding polling is a good thing. Note that I have never used it over a network.
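For comparison, the same wait-for-file pattern in plain C# (a minimal sketch; the UNC path and filter mirror the question's placeholders):

using System;
using System.IO;

class WaitForFile
{
    static void Main()
    {
        // Block until a matching file is created, or give up after 10 seconds.
        using (var watcher = new FileSystemWatcher(@"\\myserver\mydir", "filepattern*.csv"))
        {
            watcher.IncludeSubdirectories = false;
            WaitForChangedResult result = watcher.WaitForChanged(WatcherChangeTypes.Created, 10000);
            Console.WriteLine(result.TimedOut ? "TIMED OUT" : "Created: " + result.Name);
        }
    }
}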

Memory leak while sending response from rebus handler

I have seen some very strange behavior in my Rebus handler, which is self-hosted in an exe. Right after sending a response using the bus.Send method, the memory consumed by the process goes up. I looked at the object graph with a memory profiler and found that Rebus is holding the response message in serialized format somewhere.
The object graph showed the following hierarchy to the root:
System.Messaging.Message --> CachedBodyMessage --> stream
Give me some pointers if anybody is aware of this behavior.
I understand that a memory leak is a grave concern, but my belief is that it is unlikely that Rebus contains one.
This belief is rooted in the fact that I have been running Windows Service-hosted Rebus endpoints in production for 1.5 years now, and several of them (e.g. the timeout managers) have at times run for several months without being restarted.
I'd like to be absolutely bulletproof sure though, so I'm willing to investigate the issue you're reporting.
You mention "CachedBodyMessage"; judging by the names of the fields inside System.Messaging.Message, it sounds like something within MSMQ. To try to reproduce your issue, I wrote the following test:
[Test, Ignore("Only works in RELEASE mode because otherwise object references are held on to for the duration of the method")]
public void DoesNotLeakMessages()
{
    // arrange
    const string inputQueueName = "test.leak.input";
    var queue = new MsmqMessageQueue(inputQueueName);
    disposables.Add(queue);

    var body = Encoding.UTF8.GetBytes(new string('*', 32768));
    var message = new TransportMessageToSend
    {
        Headers = new Dictionary<string, object> { { Headers.MessageId, "msg-1" } },
        Body = body
    };

    var weakMessageRef = new WeakReference(message);
    var weakBodyRef = new WeakReference(body);

    // act
    queue.Send(inputQueueName, message, new NoTransaction());
    message = null;
    body = null;
    GC.Collect();
    GC.WaitForPendingFinalizers();

    // assert
    Assert.That(weakMessageRef.IsAlive, Is.False, "Expected the message to have been collected");
    Assert.That(weakBodyRef.IsAlive, Is.False, "Expected the body bytes to have been collected");
}
which verifies that the sent transport message is collected as it should be (it will only pass in RELEASE mode though, because of the way DEBUG mode holds on to object references within scope).
I'll try and run the TimePrinter sample now and leave it running for a while to see if I can reproduce the issue. If you stumble upon more information about e.g. exactly which objects are leaking, it would be very helpful.
Thanks again for taking the time to report your worries to me :)
Followup:
I've modified the TimePrinter sample so that it sends 50 msg/s and includes a 64 KB random string payload with each message, and I've tracked the memory usage for almost four hours now. As you can see, it does not look like memory is being leaked.
I'll leave it running the rest of the day, just to be sure.
Maybe you can tell me some more about why you suspected there was a memory leak in the first place?
Update:
As you can see from the trace, it has now been running for 7 hours, and thus more than 1,200,000 messages containing more than 70 GB of data have been sent and consumed by the same process. If cached message bodies were leaking, I am pretty sure we would have seen something rising on the graph.

Session is timing out on the drop down selected index change

Hello Everyone,
I am facing a weird problem here. I have a report page that uses a drop-down list of years. When the user selects year 2009, I display the report for the 2009 data. The code is given below. The website is now live on our web server. The page accesses heavy data, so it sometimes takes a minute or more to load the report for the selected year, and when that happens my session expires and the user is redirected to the default page. The same thing works fine in the solution on my machine and on one of our local servers; it just doesn't work on our live server. Please help by posting solutions if you know any.
I have also placed this line in my web.config but it is not helping:
Code:
protected void ddlYear_SelectedIndexChanged(object sender, EventArgs e)
{
    if (Session["UserId"] != null)
    {
        Session["IsDetailedReportLoaded"] = false;
        Session["IsScoreCardLoaded"] = false;
        Session["IsChartLoaded"] = false;
        Session["IsReportLoaded"] = false;

        string strYear = ddlYear.SelectedValue;
        LoadReport(Convert.ToInt16(strYear));

        lblYear.Text = strYear;
        lblAsOf.Text = strYear;
        ddlYearDetail.SelectedValue = ddlYear.SelectedValue;
        ddlYearScorecard.SelectedValue = ddlYear.SelectedValue;
        ddlYearGraph.SelectedValue = ddlYear.SelectedValue;
        mpeLoading.Hide();
    }
    else
    {
        Response.Redirect("Default.aspx");
    }
}
Thanks,
Satish k.
One possible problem could be that the web server is running out of memory and forcing the app pool to recycle, which would flush the InProc session memory. You could try using SQL Server session state instead and see if that resolves the problem. Also try monitoring the web server processes to see whether they are recycling frequently.
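If you try SQL Server session state, the switch is made in web.config; a sketch (the connection string is a placeholder, and the session database must first be created with the aspnet_regsql.exe tool):

<!-- Move session state out of process so an app-pool recycle no longer wipes it. -->
<system.web>
  <sessionState mode="SQLServer"
                sqlConnectionString="Data Source=myServer;Integrated Security=SSPI;"
                timeout="30" />
</system.web>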
You can also place a check in your code and redirect or stop execution appropriately:

if (Session.IsNew)
{
    // The session was reset; send the user back to the default page.
    Response.Redirect("Default.aspx");
}
I would check the Performance tab in IIS to see whether a bandwidth threshold is set:
Right-click the website in IIS
Performance tab
Check the "Bandwidth throttling" limit
If a threshold is set, you might be hitting the maximum bandwidth (KB per second) limit. Either disable bandwidth throttling or increase the limit.

Silverlight MediaElement refusing to play audio

I am having the hardest time figuring this problem out. I have a Silverlight 4 application that loads audio and video files from URLs. The URLs are the same domain as the application is hosted on and it works great for video.
The URLs are actually ASP.NET MVC controllers that are responsible for reading the file from a shared location on the server and serving back a file stream. The URLs look something like this:
http://localhost:31479/CourseMedia?path=\omnisandbox1\ILMSShare2\Demo-Fire+Behavior\media\Disclaim.wma&encrypted=False&id=00000000-0000-0000-0000-000000000000
If I put the URL directly into the browser the file loads and plays in windows media player just fine, and if I use a separate test silverlight project to load the url it also works, but for the life of me I can not get it to work properly in my main project.
This is the routine I use to actually do the source setting:
protected void SetPlayerURL(MediaElement player, string url)
{
    if (player != null && url.Length > 0)
    {
        player.ClearValue(MediaElement.SourceProperty);
        player.Source = new Uri(this.Packet.GetMediaUrl(url, false, Guid.Empty));
    }
}
and the GetMediaURL function simply builds the URL format seen above:
public string GetMediaUrl(string path, bool encrypted, Guid key)
{
    StringBuilder builder = new StringBuilder();
    builder.AppendFormat("http://{0}/CourseMedia?path={1}&encrypted={2}&id={3}",
                         this.Host,
                         System.Windows.Browser.HttpUtility.UrlEncode(path),
                         encrypted,
                         key);
    return builder.ToString();
}
The request to the controller is never made for the media when it is audio, which seems odd to me, as this exact code works fine for video. The MediaElement state never leaves "Closed", and the CurrentStateChanged, MediaOpened, and MediaFailed events are never triggered.
I am at a loss!
Try setting the MediaElement's ScrubbingEnabled property to false; there were some problems with Framework version 3.5 and audio, and the workaround was setting it to false. Might be worth trying.
Also try capturing BufferingStarted, BufferingEnded, and MediaEnded along with your MediaFailed and MediaOpened events. I'm curious whether it is a buffering issue.
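To see exactly what the element is doing, you can wire the events up for diagnostics; a sketch against Silverlight's MediaElement events (Silverlight exposes BufferingProgressChanged rather than separate started/ended events; player is the element passed to SetPlayerURL above, and Debug.WriteLine requires System.Diagnostics):

// Log every state transition and any failure reason to the debug output.
player.CurrentStateChanged += (s, args) => Debug.WriteLine("State: " + player.CurrentState);
player.BufferingProgressChanged += (s, args) => Debug.WriteLine("Buffering: " + player.BufferingProgress);
player.MediaOpened += (s, args) => Debug.WriteLine("MediaOpened");
player.MediaEnded += (s, args) => Debug.WriteLine("MediaEnded");
player.MediaFailed += (s, args) => Debug.WriteLine("MediaFailed: " + args.ErrorException);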
