I have an ASP.NET project which is a front-end to a database. In addition to the large tables, the DB contains a few small tables to help normalize the larger tables with common values. I have a VB.NET project which loads the smaller tables into memory, using "Shared" (i.e., "static" in C#) member variables, and uses them. I have a call to load the tables in Global.asax - Application_Start. This works for a while. That is, Application_Start runs when I first run my project, loads the cached values, and will correctly keep them in memory for a while.
What I'm seeing (when running my project via Visual Studio 2008 Debugger, hosted locally) is:
A) The Application_Start code will run more than once. Not in a row, but after the user has navigated to some other pages, I'll see (my breakpoint in) another call to initialize the cache, coming from Application_Start. Is this expected?
B) The "Shared" variable that was set to True when the cache was initialized is now False again (which should only happen when the class is first loaded). Similarly, all the data that was chached is no longer present. That is, it looks like VB is unloading all the Shared members. Is this expected?
If these are the expected behaviors, is there a way to do what I want? The code is in a module that is also used by other (non-ASP.NET) projects, and seems to work correctly for them. I'd rather not have to duplicate this functionality for something specific to ASP.NET, but would like to know what my options are. Thanks for any advice.
Here is an article you might find helpful about Caching Data at Application Startup. It sounds like you are doing everything right. Application_Start should only get called once, unless some external change restarts the app pool, but in that case I would think you would get detached from the debugger (assuming you are attached to the app pool process for your ASP.NET application).
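Since an app pool recycle or AppDomain restart will wipe static (Shared) members, one common pattern is to make the cache reload itself lazily instead of relying only on Application_Start. A rough sketch in C# to match the other snippets here (the class name, table name, and loader method are made-up placeholders; the same idea applies to VB.NET Shared members):

// Rough sketch only - LookupCache and LoadTable are hypothetical names.
public static class LookupCache
{
    private static readonly object _sync = new object();
    private static System.Data.DataTable _statusCodes;   // null until loaded

    public static System.Data.DataTable StatusCodes
    {
        get
        {
            // Lazily (re)load, so the data comes back even after an
            // AppDomain recycle wipes the static (Shared) fields.
            if (_statusCodes == null)
            {
                lock (_sync)
                {
                    if (_statusCodes == null)
                        _statusCodes = LoadTable("StatusCodes");
                }
            }
            return _statusCodes;
        }
    }

    private static System.Data.DataTable LoadTable(string tableName)
    {
        // ... fill a DataTable from the small lookup table here ...
        return new System.Data.DataTable(tableName);
    }
}

Application_Start can still touch LookupCache.StatusCodes once to warm things up, but a recycle no longer leaves you with empty data.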
I have a basic Web Forms ASP.NET site. Currently it's working on pre-created SQL tables and I have to manually trigger it to update data. Moving towards a live deployment, though, I'd like to make it more convenient.
How would I make it so that whenever the server software loads it up, the first thing it does before accepting any requests is to run an initialization sub? That way I can make sure all the tables are there and create them if they're missing.
Also, I'd like to run another sub that would trigger the data update periodically, every few hours. I was thinking that if I could get my initialization sub working, I could just spawn a background thread to deal with that, but if there's a built-in option, I'll take it.
whenever the server software loads it up
In ASP.NET, you have the global.asax file - open the code-behind for that and look at the possible overrides. Among them will be:
protected void Application_Start()
This always runs when the application starts up and you could use this to check the DB.
If you're in an "in-house" environment where there's a single live database server and a single live application server, then it should be ok to assume that the database is deployed before the application and you won't need this. If you're providing an application to a third-party or providing it on the web, then this is a good place to check. How you generate the DB is up to you, but checking here is a good idea. You could also have a (hidden) admin page on your site that checks the database connection etc.
trigger the data update periodically
This isn't built in to ASP.NET, as ASP.NET waits for requests and responds to them. There are ways around this, but they are generally triggered externally to the application. The easiest is a simple Windows scheduled task that hits a page to trigger the check.
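One minimal way to expose that trigger is a plain generic handler that the scheduled task requests every few hours; the handler name, the RunDataUpdate helper, and the shared-secret check below are illustrative assumptions, not a prescribed design:

// UpdateTrigger.ashx.cs - hypothetical generic handler; a scheduled task can hit
// http://yourserver/UpdateTrigger.ashx?key=... every few hours.
public class UpdateTrigger : System.Web.IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(System.Web.HttpContext context)
    {
        // Simple shared-secret check so random visitors can't trigger the update.
        if (context.Request["key"] != "some-secret-value")
        {
            context.Response.StatusCode = 403;
            return;
        }

        RunDataUpdate(); // your existing update routine (hypothetical name)
        context.Response.Write("Update completed.");
    }

    private void RunDataUpdate()
    {
        // ... refresh the tables here ...
    }
}

The scheduled task itself can be as simple as a command that requests the handler's URL on whatever interval you need.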
This is what's referred to as "deployment".
If your web site is deployed via MSI, this step should be done in MSI.
If your web site is deployed via the Visual Studio "Publish" option, this is where you need to create the tables.
Some applications indeed do as you say, e.g. create SQL tables on the first run. The problem with this approach is that your app will need sa rights instead of simple read/write permissions. This could lead to security issues.
Code which runs on web site launch (which is where initialization belongs) is located in global.asax, in:
protected void Application_Start()
I'm building a website (for personal use, low load) and instead of using an Access or MySQL database for data storage I'm thinking of having one XML file that I load and parse on Application_Start and then keep in memory (in static objects). The website then does reads and writes against these in-memory objects, and I will finally persist all data to the XML file on Application_Disposed.
I'm aware that I'll need to make reading/writing thread-safe, but besides that, does anyone see any problem using this approach?
Yes, I see a big problem: there are a number of reasons why the whole application might die without you knowing about it, and without your data being saved to that XML file.
You'll find Application_Disposed can get fired multiple times (so it might not be the best place to dispose your DI containers etc.), whereas Application_End will only fire once (you can prove this by adding logging).
https://bytes.com/topic/asp-net/answers/561768-event-sequence
https://learn.microsoft.com/en-us/previous-versions/ms178473(v=vs.140)?redirectedfrom=MSDN
It seems VS2019 IIS Express doesn't call Application_End as it should when you stop debugging, but full IIS will.
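Given that Application_End isn't guaranteed to fire (a crash, a killed worker process, or IIS Express stopping), a belt-and-braces approach is to save on Application_End and also on a timer. A sketch, where SaveToXmlFile and the five-minute interval are made-up placeholders:

// In Global.asax.cs - sketch only; SaveToXmlFile is a hypothetical method that
// serializes the in-memory objects back to the XML file.
private static System.Threading.Timer _saveTimer;

protected void Application_Start()
{
    // Periodic safety-net save, in case the app dies before Application_End ever fires.
    _saveTimer = new System.Threading.Timer(
        _ => SaveToXmlFile(),
        null,
        TimeSpan.FromMinutes(5),
        TimeSpan.FromMinutes(5));
}

protected void Application_End()
{
    // Fires once on a clean shutdown (unlike Application_Disposed,
    // which can fire multiple times).
    SaveToXmlFile();
}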
I've run into a problem I just can't seem to solve. The background: years ago, I developed a web site for one of my customers using ASP.NET 2.0 and Ajax. One function of the web site is to produce customer invoices on demand. Their in-house production system is written in Visual FoxPro 8 with SQL Server 2005 on the back end. Since I already had an invoice generation object that would produce a PDF file, I rolled up a COM EXE and created a COM wrapper for use in my ASP.NET page. It worked great for years, but now we're trying to move the page to a different location and things aren't working so great.
The network techs have reproduced the environment and the rest of the web site runs perfectly. I can even instantiate the COM object (I've logged the init and all is well), but the very first call to one of the object's methods results in an "Exception from HRESULT: 0x80010105 (RPC_E_SERVERFAULT)". I'm just plain stuck!
Here's what does work:
1) Using a Visual FoxPro program from the same server, I can instantiate the object, call the generate invoice method and produce the PDF - no problem whatsoever.
2) Using VBScript from a very simple ASP page I can use Server.CreateObject() to instantiate the object and successfully generate the invoice from there.
So far I know:
1) the object is registered correctly and is launching as the proper user, with all of the rights needed to do its business.
2) the wrapper for the COM EXE and COM object versions are matched.
I apologize for the long post. To make a long story short: Why would ASP.NET not be able to make a call to any method of a VFP COM object after it's been instantiated successfully?
Thanks in Advance - I'm seriously stuck on this one.
Erik
For those running into the same situation, adding the COM EXE to the Data Execution Prevention (DEP) exception list allowed the calls to the object's methods.
Did you compile it as an EXE, a runtime DLL, or a multi-threaded DLL? Additionally, a problem I've had before is that of single vs. multiple instancing of an OlePublic DLL entry. To confirm, modify your project, then from the top menu click "Project", then "Project Info". The third tab, "Servers", shows the available servers in your project. On the right side of it is "Instancing", which will be either Single or Multiple. Sometimes just switching this to Single has solved things for me. However, if you're multi-threading, make sure you have the multi-threaded runtime DLL too: VFP9T.DLL.
--- EDIT PER RESPONSE...
Since you compiled it as an EXE, it's probably going to show up as a distributed COM (DCOM) object. Go to the Windows Start menu and run "DCOMCNFG", which will bring up the DCOM configuration manager. You'll have to scroll down the list of items until you find your EXE (OlePublic class name), and you might have to revise permissions: who can launch / access / execute, impersonation settings, etc.
FOR TESTING ONLY: you could set this COM server to impersonate Administrator, just to test whether the errors go away. If there are no errors, you'll know it's a permissions thing; then change it back to a more restricted user.
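If it helps to narrow down whether the failure is in activation or only in the first method call, a quick late-bound test from a scratch ASP.NET page can isolate the two steps; the ProgID, method name, and argument below are hypothetical stand-ins for the real COM wrapper:

// Sketch only - "MyApp.InvoiceServer" and GenerateInvoice are hypothetical names.
var comType = Type.GetTypeFromProgID("MyApp.InvoiceServer");
object server = Activator.CreateInstance(comType);   // activation reportedly succeeds
try
{
    // The first method call is where RPC_E_SERVERFAULT shows up.
    object result = comType.InvokeMember(
        "GenerateInvoice",
        System.Reflection.BindingFlags.InvokeMethod,
        null,
        server,
        new object[] { 12345 });
}
catch (System.Runtime.InteropServices.COMException ex)
{
    // Log ex.ErrorCode (0x80010105) and ex.Message for comparison
    // between the old and new servers.
    System.Diagnostics.Trace.WriteLine(ex.ToString());
}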
Can we depend on the current working directory in ASP.NET code-behinds? Or, in other words, can we use relative paths, and be sure that they'll work?
If, in one page on a website, I set the current working directory to something specific, will it still be the same the next time another page on the website is loaded? When the same page on the website is loaded?
If I set the current working directory to something specific, in Page_Load(), can I be sure that it will still be the same by the time Page_PreRender() is called? Or could another page on the same website change it on me, in between? Could a page on a different website in the same application pool change it on me? A page in a different website in a different app pool?
In other words, what is the scope of the current working directory, in IIS? Is it specific to a page? Is it specific to a web site? Or is it shared among all pages in an app pool?
Where, among page, website, app pool, and server, are the boundaries that isolate different values of current working directory?
AppDomain.CurrentDomain.RelativeSearchPath will give you the physical path to the bin folder
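For example, a file deployed alongside your assemblies can be located from that path rather than from the current directory (the file name here is just an illustration):

// "lookup.xml" is only an example file name.
string binPath = AppDomain.CurrentDomain.RelativeSearchPath;        // physical path to bin
string filePath = System.IO.Path.Combine(binPath, "lookup.xml");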
Environment.CurrentDirectory is a simple wrapper around the GetCurrentDirectory and SetCurrentDirectory winapi functions. Indeed, trying to set the directory requires UnmanagedCode permissions. Whenever a function prevents your site from running in partial trust, you are right to be wary of depending on it. :)
From the SetCurrentDirectory documentation:
Changes the current directory for the current process.
The best explanation I can find that covers the relationship between the w3wp.exe process and an ASP.NET site is this answer. Any other page within your site could potentially change your page's current working directory. Any pages on any other site under the same application pool could potentially change your page's current working directory. These outside changes to the current working directory could happen at any time during your page's execution. On the other hand, a page on a site under a different application pool will not change your page's current working directory. The reason I say "could potentially" is that it gets even more complicated if you consider web garden scenarios, where there can be more than one process for a single ASP.NET site.
Now consider that SetCurrentDirectory is not thread safe:
Multithreaded applications and shared library code should not use the SetCurrentDirectory function and should avoid using relative path names. The current directory state written by the SetCurrentDirectory function is stored as a global variable in each process, therefore multithreaded applications cannot reliably use this value without possible data corruption from other threads that may also be reading or setting this value. This limitation also applies to the GetCurrentDirectory and GetFullPathName functions. The exception being when the application is guaranteed to be running in a single thread, for example parsing file names from the command line argument string in the main thread prior to creating any additional threads. Using relative path names in multithreaded applications or shared library code can yield unpredictable results and is not supported.
Chances are that you don't want to depend on the current working directory. Having said that, given how foolish it is to rely on the current working directory, you can be reasonably certain that no other code will be touching it. :) A quick peek with Reflector shows that no .NET framework code changes it. A few functions do check it though, so watch out for those. If you control the deployment environment, you can ensure that your site runs in its own application pool. With proper synchronization technique, you should then be able to safely update the current working directory. I wouldn't consider it anything other than a hack though.
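In practice, the usual alternative is to resolve absolute paths from the application root rather than from the process's current directory, for example (the App_Data file name is hypothetical):

// From a page or user control:
string path1 = Server.MapPath("~/App_Data/settings.xml");

// From anywhere in the AppDomain, even outside a request:
string path2 = System.Web.Hosting.HostingEnvironment.MapPath("~/App_Data/settings.xml");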
Links should be created relative to the site root using the tilde (~) operator, for example with a server-side anchor:
<a runat="server" href="~/SomePage.aspx">Some Page</a>
Within a server, an application pool completely isolates your site so that if some other site crashes on the same server, it won't bring down your site with it. IIS is pretty much site-specific with the added isolation benefits of app pools. I can see no practical use in trying to change a link on one page from the code-behind in another (or maybe I don't quite understand the question).
Here's a summary of the IIS architecture:
http://learn.iis.net/page.aspx/243/aspnet-integration-with-iis-7/
I am developing a component to create bespoke BulkImport functionality in ASP.NET. Under the hood, this component will be using the SqlBulkCopy class. There will be different file formats. The file is imported into an intermediate table and is then transformed into the required tables. The upload file can be big and might take a couple of minutes to process. I would like to use a Thread or the ThreadPool to do asynchronous processing. Can you please suggest a good approach to handle this problem?
Note: this is an internal application which would be used by at most 2-5 people at any given time.
The main problem with firing up additional threads in ASP.NET is that the framework can rip the AppDomain out from under you (for example, if someone edits the web.config or IIS decides to recycle the worker process). If that happens, your worker thread is also terminated and you can't really control it.
If you don't think that'll be a problem, then it doesn't really matter, but I would suggest that a better solution would probably be to fire up the work in a separate process that you can then monitor from your web application.
That way, if someone edits the web config, or IIS recycles the worker process, the import process is running independently and you don't have to worry.
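A rough sketch of that hand-off (the worker executable, its path, and the Jobs table are all made up): the web app records a job row, starts the external worker process, and then just polls the table for status.

// Sketch only - "BulkImportWorker.exe" and the Jobs table are hypothetical.
int jobId = 42;   // id of the row you inserted into a Jobs table for this upload
var startInfo = new System.Diagnostics.ProcessStartInfo
{
    FileName = @"C:\Tools\BulkImportWorker.exe",
    Arguments = jobId.ToString(),
    UseShellExecute = false,
    CreateNoWindow = true
};
System.Diagnostics.Process.Start(startInfo);
// The import now runs outside the ASP.NET worker process, so an AppDomain
// recycle won't kill it; the web app can monitor progress via the Jobs table.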
Here is my approach:
Ask the user to paste in the UNC path to the file. Save this path into a table in SQL.
Write a Windows service to check for new entries in the path table. When it finds a new entry, start processing the file. Update the table periodically with the progress and check the flags (below).
Have an AJAX callback in the browser that checks the table for progress, returning it as a percentage to the client. Allow the client to stop the process by adding some flags to the table.
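A bare-bones version of that service's polling loop might look like this (the ImportQueue table, its columns, _stopRequested, _connectionString, and ProcessFile are all placeholders):

// Inside the Windows service - sketch only.
while (!_stopRequested)
{
    using (var conn = new System.Data.SqlClient.SqlConnection(_connectionString))
    using (var cmd = new System.Data.SqlClient.SqlCommand(
        "SELECT TOP 1 Id, UncPath FROM ImportQueue WHERE Status = 'New'", conn))
    {
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            if (reader.Read())
            {
                int id = reader.GetInt32(0);
                string path = reader.GetString(1);
                reader.Close();

                // Process the file, periodically writing progress back to the
                // same row and checking its cancel flag.
                ProcessFile(conn, id, path);
            }
        }
    }

    System.Threading.Thread.Sleep(System.TimeSpan.FromSeconds(30)); // poll interval
}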