Background
In an ASP.NET site, I'm using a code documentation tool called Nocco. Nocco is a command-line tool that you explicitly run on a particular code file to output an HTML-rendered version of that code and its documentation. I've currently set up some code in my Global.asax Application_Start method to crawl through a couple of directories and process all the code files in each.
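Roughly, that startup code looks like this (the directory names and the exact Nocco invocation below are illustrative):

protected void Application_Start(object sender, EventArgs e)
{
    // Illustrative: crawl a couple of content directories for C# files.
    var directories = new[] { Server.MapPath("~/App_Code"), Server.MapPath("~/Scripts") };
    foreach (var dir in directories)
    {
        foreach (var file in System.IO.Directory.GetFiles(dir, "*.cs"))
        {
            // Run Nocco once per file; each run takes roughly a second.
            var psi = new System.Diagnostics.ProcessStartInfo("nocco.exe", "\"" + file + "\"")
            {
                UseShellExecute = false,
                CreateNoWindow = true
            };
            using (var process = System.Diagnostics.Process.Start(psi))
            {
                process.WaitForExit();
            }
        }
    }
}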
Problem
Ultimately, putting it in Global.asax's Application_Start means that it builds the Nocco documentation, which takes ~1 second per file, every time the application starts - not just once per deployment. This seems inefficient and a waste of the user's time while the page is loading.
Question
Is it possible to execute code internal to the ASP.NET application (such as a class method) as a post build event? I know that I could convert this part of my setup to a standalone application or even a batch script, but I've had this question for other circumstances as well and have wondered whether or not it's possible.
You could do your generation in a warm-up script; here is the link for the IIS 7.5 Application Warm-Up module:
http://blogs.iis.net/thomad/archive/2009/10/14/now-available-the-iis-7-5-application-warm-up-module.aspx
Alternatively, you can extract the code documentation functionality into a separate assembly, include it in a standalone app, and call that app as an external command from the project's build events.
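A minimal sketch of that second approach, assuming nocco.exe is on the PATH (the project layout and arguments are placeholders); the resulting executable can then be invoked from the project's post-build event command line, e.g. "$(ProjectDir)Tools\DocGen.exe" "$(ProjectDir)":

// DocGen/Program.cs - hypothetical standalone documentation generator,
// called from the post-build event instead of Application_Start.
using System;
using System.Diagnostics;
using System.IO;

class Program
{
    static int Main(string[] args)
    {
        // The root directory to crawl is passed in by the post-build event.
        var root = args.Length > 0 ? args[0] : Directory.GetCurrentDirectory();
        foreach (var file in Directory.GetFiles(root, "*.cs", SearchOption.AllDirectories))
        {
            var psi = new ProcessStartInfo("nocco.exe", "\"" + file + "\"")
            {
                UseShellExecute = false,
                CreateNoWindow = true
            };
            using (var nocco = Process.Start(psi))
            {
                nocco.WaitForExit();
            }
        }
        return 0;
    }
}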
Related
I need to run four background jobs for cleaning temp files and processing some files. I have chosen Quartz.NET for the job.
I have an ASP.NET website which accepts file uploads that will be processed by the Quartz jobs at night.
First I thought about making a console application for the Quartz jobs, keeping the website and the jobs totally decoupled.
But then I saw that I will need some config values (connection string and paths to files) that are in the ASP.NET web.config. So a question came to my mind:
Should I run the jobs through the ASP.NET instance, or should I do this in a console application?
Furthermore, I want the website to show a special page while the Quartz jobs are running (something like "We are processing the files...").
What I care about most is performance: I don't want the website to be affected by the Quartz jobs, nor the jobs' performance affected by the website.
So, what should I do? Have you done something like this and can you give me some advice?
Should I run the jobs through the ASP.NET instance, or should I do this in a console application?
If you want to have to manually trigger them each night, sure. But a console application run by the host system's task scheduler seems like a more automated solution. A web application is more of a request/response system; it's not really suited to periodic or long-running actions. Scheduling some sort of background operation on the host, such as a scheduled console application or a Windows service, would serve that purpose better.
Note that if it truly needs to be unattended and to run even when nobody is logged in to the server console, a Windows service may be a better approach than a console application.
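For illustration, a minimal sketch of a Quartz.NET job hosted in a console application, using the Quartz.NET 2.x-style API (the job body and cron schedule are placeholders):

using Quartz;
using Quartz.Impl;

// Placeholder job: clean temp files / process the uploaded files.
public class CleanTempFilesJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // ... cleanup and file processing here ...
    }
}

class Program
{
    static void Main()
    {
        var scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        var job = JobBuilder.Create<CleanTempFilesJob>().Build();
        var trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 2 * * ?") // every night at 02:00
            .Build();
        scheduler.ScheduleJob(job, trigger);

        // Keep the process alive; a Windows service would handle this more robustly.
        System.Threading.Thread.Sleep(System.Threading.Timeout.Infinite);
    }
}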
I've seen that I will need some config values (connection string and paths to files) that are in the ASP.NET web.config
Console applications have App.config files, which serve the same purpose. You can use that.
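For example (the key names below are illustrative):

// Reading the same kinds of values from the console app's App.config;
// requires a reference to System.Configuration.dll.
using System.Configuration;

string connectionString =
    ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;
string uploadPath = ConfigurationManager.AppSettings["UploadPath"];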
I want the website to show a special page while the Quartz jobs are running
You definitely want to keep the two decoupled, but you may be able to accomplish this easily enough. Maybe have some sort of status flag in the database which indicates whether any particular record is "currently being processed". The website can simply look for any records with that flag when a page loads and display the message.
There are likely a couple of different ways to synchronize status here; it doesn't really matter which you choose. What does matter is that the systems remain decoupled and that any status which is statically persisted is handled somewhat carefully, to prevent an errant process from leaving an incorrect status behind. (For example, a background task sets a status of "processing" and then fails in some way. The website would forever indicate that it's processing.)
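As a rough sketch of the status-flag idea (the table, column, and connection-string names are made up):

using System.Configuration;
using System.Data.SqlClient;

// The website calls this on page load to decide whether to show
// the "We are processing the files..." page.
public static bool IsProcessing()
{
    var cs = ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;
    using (var conn = new SqlConnection(cs))
    using (var cmd = new SqlCommand(
        "SELECT COUNT(*) FROM UploadedFiles WHERE Status = 'Processing'", conn))
    {
        conn.Open();
        return (int)cmd.ExecuteScalar() > 0;
    }
}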
I have a web application that is used by several different clients. At the moment, the process of updating their end with any changes is as follows:
1. Publish/compile the app.
2. Put the relevant files into a zip (not web.config, as each client has different DB paths and I don't want to overwrite them).
3. Generate scripts on SQL Server for all stored procedures.
4. Add them to the zip.
5. Upload the zip to the web.
6. A WPF app I created, run from the client's server, downloads the zip, extracts the files to the web app folder, and executes the scripts for the SQL Server stored procedures.
Now this does work, but it requires an IT guy at the client end to run the WPF app to update, and it can be days before some of them get around to it. So what I would like to do is provide the ability to update the web app from WITHIN the web app. I know I can create a DLL to do the FTP, extraction, etc., but how can I get this to display progress on the page?
Alternatively, if anyone has a way of updating the web app without the need for someone to access the server it's on, even better, as the current method makes it hard to let clients know when there is an update available.
You can use, for example:
[assembly: PreApplicationStartMethod(typeof(Your.Type), "MethodNameToCall")]
which is specified in the AssemblyInfo.cs file of a project to run setup code before the application starts. Since a deployment restarts the application, this effectively runs on every deployment and would allow you to do your copying/setup. You could probably run the WPF app from this code via
System.Diagnostics.Process
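A minimal sketch of how that wiring might look (the type and method names are placeholders):

// In AssemblyInfo.cs:
[assembly: System.Web.PreApplicationStartMethod(typeof(MyApp.Startup), "Initialize")]

// Startup.cs - the method must be public, static, void, and parameterless.
namespace MyApp
{
    public class Startup
    {
        public static void Initialize()
        {
            // Do the copying/setup here; an external updater could be launched via
            // System.Diagnostics.Process.Start("updater.exe") (illustrative).
        }
    }
}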
UPDATE:
Having re-read this post, it seems clear to me that this is about moving from a WPF app to a web-based app. It also appears the poster just wants a method by which to signal back from the code that is updating the file system on the client side, so...
Depending on how complex the required input is, you may need one or more pages and a navigation system to go forward and back.
However, once all the input has been taken and the update has commenced, you have a couple of options - one 'hacky', the other not so.
1 (hacky): Refresh the page using window.location in JavaScript and setTimeout, along with session tracking, to report the progress of the threaded code-behind. EWWWWW...
2: Create an AJAX function using setInterval to poll the server, probably calling a method decorated with the [WebMethod] attribute. This method can send arbitrary data back to the AJAX call, which is then used to update the UI (perhaps using something like the jQuery UI progress bar).
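As a rough sketch of the server side of option 2 (the page class and the static progress field are purely illustrative):

using System.Web.Services;

public partial class UpdatePage : System.Web.UI.Page
{
    // Updated from the background update thread (illustrative).
    public static volatile int PercentComplete;

    [WebMethod]
    public static int GetProgress()
    {
        // The client-side setInterval polls "UpdatePage.aspx/GetProgress" via ajax
        // and feeds the result to, e.g., a jQuery UI progress bar.
        return PercentComplete;
    }
}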
NOTE: if you are replacing anything in the bin folder, or touching web.config or in fact ANY .aspx page, then the application will restart automatically... If this is the case, then you will have to code a separate application that updates the other one from the outside, and you should signal to any connected users that a shutdown will occur shortly and block new users until the upgrade has completed.
I have an ASP.Net application that takes an unusually long time to start up the first time it is accessed. I did some tracing and I found that 57 seconds are spent in this function:
Boolean System.Web.Compilation.WebDirectoryBatchCompiler.CompileNonDependentBuildProviders(ICollection)
and that function in turn calls the following one 6 times:
Void System.Web.Compilation.WebDirectoryBatchCompiler.CompileAssemblyBuilder(AssemblyBuilder)
My question is what does System.Web.Compilation.WebDirectoryBatchCompiler.CompileAssemblyBuilder do? My web application is already compiled, I don't know why it is doing any kind of compilation work on start-up. Is this normal? Is there something going on that I don't know about?
There is quite a bit of bootstrapping that occurs when an ASP.NET application is started. This includes the worker process kicking in, assemblies loaded into the AppDomain, and also compilation of files in the current directory. This batch compilation process is per folder, which means if I request / for the first time, the batch compiler will scan the folder for supported types, compile them, and cache the result. This is only done within the root / folder. My first request to another /OffRoot folder will result in another batch compile.
If you have a precompiled site, the runtime still performs this type of scan but determines that it doesn't have to compile anything.
There is an important difference between a precompiled website and a compiled web application. A precompiled website has this first-request compilation done ahead of time, so the runtime only needs to load the assemblies into the AppDomain. With a compiled web application, the code-behind source is compiled, but the views (.aspx files) are not, so the first-time (dynamic) compilation still happens.
AFAIK, ASP.NET follows the standard .NET way of doing things, which of course means that there are actually two parts to compilation. First, the source is compiled into the .NET bytecode format (IL). Second is the actual conversion into a format suitable to run on your system, usually in a just-in-time (JIT) fashion. This is similar to Java, though there are a lot of lower-level differences.
The issue is that it's currently doing this JITting up-front, which is by design. It can take a while to get the ASP.NET app up and working, which is the minute you're seeing. I do believe there is a way to enable pre-JITting prior to having someone actually visit the site, but I'm not sure of the exact manner. Hopefully someone will post/link to the actual method for doing it.
Check index.aspx or default.aspx to see whether they pull in any other web applications. Sometimes it takes time to find the files, and compilation takes a while only the first time.
Maybe I'm totally outdated, but for the last four years I've been using a simple FTP upload to deploy a new website, without even building it within Visual Studio - just a bunch of ASPX and CS files, the same as in Visual Studio.
I do understand that compiling the project gives me some defence, in that those who have access to the server won't be able to read the files in a text editor, and that I avoid first-time compilation, but is that so important?
I mean, you can do a lot more harm if you have access to the server than just reading CS files instead of DLLs.
First-time compilation usually takes no more than a minute; just looking for the compiled version of the site would take as much time.
Now I'm watching a video on Pluralsight that explains the new MSDeploy tool available for ASP.NET, and I can't see any good reason to use it.
So what's wrong with the old-fashioned way of just sending files via FTP, without compiling or using fancy tools?
I did a speed test, and with MSDeploy I can deploy a website twice as fast as with old-fashioned FTPing. So instead of 4 minutes it will take 2.
Now look at it from another perspective: I already have a live project on the web, in which I have to change Default.aspx because I have a typo in some HTML tag. Deployment via MSDeploy will take 10 times longer than uploading one file.
Maybe I'm missing something?
MSDeploy does things which FTPing to a site can't do. Need to change a machine.config? You're unlikely to have FTP write access to the folder which contains it. Want to change a server setting in a server-version-independent manner? FTP won't do that. Etc. FTP works fine for copying files to folders in which you have write access, but that's all it can do.
When you deploy a project you can do a lot of things with it.
You can set up a job in your deployment that packages all your JavaScript into one file and all your CSS into one file.
You can set up a job in your deployment that changes a bunch of config settings to match your production server settings (rather than your development settings).
The idea of deployment is that you take your current development website and transform it into a production website without having to do any of that manually.
The most important thing is that when deploying is the only way to update your website, you will never forget to package your JS or to remove some debugging code, because you can't just sneakily update a single file.
Can we depend on the current working directory in ASP.NET code-behinds? Or, in other words, can we use relative paths, and be sure that they'll work?
If, in one page on a website, I set the current working directory to something specific, will it still be the same the next time another page on the website is loaded? When the same page on the website is loaded?
If I set the current working directory to something specific, in Page_Load(), can I be sure that it will still be the same by the time Page_PreRender() is called? Or could another page on the same website change it on me, in between? Could a page on a different website in the same application pool change it on me? A page in a different website in a different app pool?
In other words, what is the scope of the current working directory, in IIS? Is it specific to a page? Is it specific to a web site? Or is it shared among all pages in an app pool?
Where, among page, website, app pool, and server, are the boundaries that isolate different values of current working directory?
AppDomain.CurrentDomain.RelativeSearchPath will give you the physical path to the bin folder
Environment.CurrentDirectory is a simple wrapper around the GetCurrentDirectory and SetCurrentDirectory winapi functions. Indeed, trying to set the directory requires UnmanagedCode permissions. Whenever a function prevents your site from running in partial trust, you are right to be wary of depending on it. :)
From the SetCurrentDirectory documentation:
Changes the current directory for the current process.
The best explanation I can find that covers the relationship between the w3wp.exe process and an ASP.NET site is this answer. Any other page within your site could potentially change your page's current working directory. Any pages on any other site under the same application pool could potentially change your page's current working directory. These outside changes to the current working directory could happen at any time during your page's execution. On the other hand, a page on a site under a different application pool will not change your page's current working directory. The reason I say "could potentially" is that it gets even more complicated if you consider web garden scenarios, where there can be more than one process for a single ASP.NET site.
Now consider that SetCurrentDirectory is not thread safe:
Multithreaded applications and shared library code should not use the SetCurrentDirectory function and should avoid using relative path names. The current directory state written by the SetCurrentDirectory function is stored as a global variable in each process, therefore multithreaded applications cannot reliably use this value without possible data corruption from other threads that may also be reading or setting this value. This limitation also applies to the GetCurrentDirectory and GetFullPathName functions. The exception being when the application is guaranteed to be running in a single thread, for example parsing file names from the command line argument string in the main thread prior to creating any additional threads. Using relative path names in multithreaded applications or shared library code can yield unpredictable results and is not supported.
Chances are that you don't want to depend on the current working directory. Having said that, given how foolish it is to rely on the current working directory, you can be reasonably certain that no other code will be touching it. :) A quick peek with Reflector shows that no .NET framework code changes it. A few functions do check it though, so watch out for those. If you control the deployment environment, you can ensure that your site runs in its own application pool. With proper synchronization technique, you should then be able to safely update the current working directory. I wouldn't consider it anything other than a hack though.
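If you do need physical paths, resolving them against the application root avoids the problem entirely; for example (the path below is illustrative):

// Resolve paths against the application root rather than the process-wide
// current working directory, which other code may change at any time.
string dataFile = System.Web.Hosting.HostingEnvironment.MapPath("~/App_Data/file.txt");

// Inside a page or handler, Server.MapPath does the same:
string samePath = System.Web.HttpContext.Current.Server.MapPath("~/App_Data/file.txt");

// The application's root folder itself:
string appRoot = System.Web.HttpRuntime.AppDomainAppPath;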
Links should be created relative to the site root using the tilde (~) operator:
<a runat="server" href="~/SomePage.aspx">Some Page</a>
Within a server, an application pool completely isolates your site so that if some other site crashes on the same server, it won't bring down your site with it. IIS is pretty much site-specific with the added isolation benefits of app pools. I can see no practical use in trying to change a link on one page from the code-behind in another (or maybe I don't quite understand the question).
Here's a summary of the IIS architecture:
http://learn.iis.net/page.aspx/243/aspnet-integration-with-iis-7/