I'm building a website (for personal use, low load) and instead of using an Access or MySQL database for data storage I'm thinking of having one XML file that I load and parse on Application_Start and then keep in memory (in static objects). The website then does reads and writes against these in-memory objects, and I will finally persist all data to the XML file on Application_Disposed.
I'm aware that I'll need to make reading/writing thread-safe, but besides that, does anyone see any problem using this approach?
Yes, I see a big problem: there are a number of reasons why the whole application might die without you knowing about it, and without your data being saved to that XML file.
You'll find Application_Dispose can get fired multiple times (so it might not be the best place to dispose your DI containers, etc.), whereas Application_End only fires once (you can prove this by adding logging; see the sketch below).
https://bytes.com/topic/asp-net/answers/561768-event-sequence
https://learn.microsoft.com/en-us/previous-versions/ms178473(v=vs.140)?redirectedfrom=MSDN
VS2019 IIS Express doesn't seem to call Application_End as it should when you stop debugging, but IIS will.
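For completeness, here is a minimal sketch of the kind of logging mentioned above, in a Global.asax code-behind; the log file path is purely illustrative. Run the site, stop debugging, recycle the app pool, and compare how often each entry appears.

using System;
using System.IO;

public class Global : System.Web.HttpApplication
{
    private static void Log(string evt)
    {
        // Append a timestamped line; the path is an assumption, use whatever log sink you have.
        File.AppendAllText(@"C:\temp\app-lifecycle.log",
            DateTime.Now.ToString("o") + " " + evt + Environment.NewLine);
    }

    protected void Application_Start(object sender, EventArgs e) { Log("Application_Start"); }
    protected void Application_End(object sender, EventArgs e) { Log("Application_End"); }
    protected void Application_Disposed(object sender, EventArgs e) { Log("Application_Disposed"); }
}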
Is it possible in any way to edit an Excel sheet that contains macros through an ASP.NET page? I have tried to open the Excel sheet and it seems to just hang rather than load. Testing with a sheet that has no macros works perfectly fine.
Disclaimer: I don't know the Excel license agreement and I don't know if utilizing Excel in a server process violates it or not. This is purely a technical description of how to get it working. The reader is advised to check the license agreement to see if it's allowed to do so or not. Different Office versions may have different license agreements. I used this method at several Fortune 100/500 companies and they didn't seem to care. Go figure.
This solution works, but it has some limitations and requires a fair amount of control over the server where it runs. The server also needs to have lots of memory.
To start, make sure that you perform a complete installation of every single Office feature on the server so that Excel won't try to install something if you attempt to use a feature that's not present.
You also need to create a dedicated user account on the server that has the right privileges. I can't tell you exactly which ones, because in my case we controlled the server and we gave admin rights to this user.
When you have the user account, you need to log in as that user and run Excel (preferably all Office applications) at least once so that it can create its settings.
You also need to configure Excel to run under this user account when it's created as a COM object. For this, you need to go into DCOM Config on the server and configure Launch and Activation Permissions for the Excel.Application object to use your new user account. I'm not sure if I remember correctly, but I think after this step, running Excel as an interactive user was slightly problematic.
By default, Office applications try to display various messages on the screen: warnings, questions, etc. These must be turned off because when you utilize an Office application from a web application, it runs on the server so a human user won't be around to dismiss these messages - the Office program will just sit around indefinitely, waiting for the message to be dismissed.
You need to set (at the minimum) these properties:
DisplayAlerts = false
AskToUpdateLinks = false
AlertBeforeOverwriting = false
Interactive = false
Visible = false
FeatureInstall = 0 'msoFeatureInstallNone
to disable UI messages from Excel. If you use Excel 2010, there may be more, but I'm not familiar with that.
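For illustration, here is a hedged sketch of creating such a "quiet" Excel instance from C# using late binding (so no interop assembly reference is needed); the factory class name is made up, and the values are the ones listed above.

using System;

static class ExcelFactory
{
    public static dynamic CreateQuietExcel()
    {
        // Always create a brand-new instance rather than attaching to a running one.
        Type excelType = Type.GetTypeFromProgID("Excel.Application");
        dynamic excel = Activator.CreateInstance(excelType);

        excel.DisplayAlerts = false;          // no "Do you want to save..." dialogs
        excel.AskToUpdateLinks = false;       // don't ask about refreshing external links
        excel.AlertBeforeOverwriting = false; // don't warn before overwriting non-blank cells
        excel.Interactive = false;            // ignore keyboard and mouse input
        excel.Visible = false;                // no visible window
        excel.FeatureInstall = 0;             // msoFeatureInstallNone: never launch the installer

        return excel;
    }
}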
If you have Excel files with macros in them, you may have to disable macro security in Excel - that can't be done programmatically, for obvious reasons.
To access Excel services, implement a manager object that will actually hold the Excel reference - don't try to hold the Excel.Application object in the page because your page code will become very complicated and you may not be able to properly clean things up.
The object that holds the Excel reference may be a separate DLL or an out-of-process server. You must make sure, however, that when you acquire an instance of Excel on a given thread you always create a new Excel instance. The default is that an already running Excel instance will also serve other requests, but that won't work for you because the same Excel instance cannot be shared among multiple threads. Each request-processing thread in IIS must have its own Excel instance - if you share instances, you'll have all kinds of problems. This means that your server will need quite a bit of memory to keep many instances of Excel running. This was not an issue for me because we controlled the server.
If you can, try to create an out-of-proc (.exe) COM server, because this way you can hold the Excel reference in a separate process. It's possible to get it working using an in-proc (.dll) COM object, but it's riskier for your application pool - if Excel crashes, it'll crash your app pool as well.
When you have an .exe server, you can pass parameters in several possible ways:
Make your manager object a COM object and pass parameters as properties.
Pass parameters as command-line parameters to the .exe as it starts up.
Pass parameters in a text/binary file; pass the name of the file on the command-line.
I used all these and found the COM object option the cleanest.
In your manager object, follow these guidelines:
Wrap every single function that uses Excel in a try..catch block to capture any possible exception.
Always explicitly release all Excel objects by calling Marshal.ReleaseComObject() and then setting your variables to null as soon as you don't need them. Always release these objects in a finally block to make sure that a failed Excel method call won't result in a dangling COM object.
If you try to use any formatting features in Excel (page header, margins, etc.) you must have a printer installed and accessible to the user account that you use to run Excel. If you don't have an active printer (preferably attached to the server), formatting-related features may not work.
When an error happens, close the Excel instance that you're using. It's not likely that you can recover from Excel-related errors and the longer you keep the instance, the longer it uses resources.
When you quit Excel, make sure that you guard that code against recursive calls - if your exception handlers try to shut down Excel while your code is already in the process of shutting down Excel, you'll end up with a dead Excel instance.
Call GC.Collect() and GC.WaitForPendingFinalizers() right after calling the Application.Quit() method to make sure that the .NET Framework releases all Excel COM objects immediately.
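Putting the guidelines above together, here is a hedged sketch of the overall shape of one manager call; the file path and the CreateQuietExcel helper from the earlier sketch are illustrative, not a prescribed API.

using System;
using System.Runtime.InteropServices;

static class ExcelManager
{
    public static void ProcessWorkbook(string path)
    {
        dynamic excel = null, workbooks = null, workbook = null;
        try
        {
            excel = ExcelFactory.CreateQuietExcel();   // new instance for this thread
            workbooks = excel.Workbooks;
            workbook = workbooks.Open(path);

            // ... do the real work here ...

            workbook.Save();
        }
        finally
        {
            // Release everything in a finally block so a failed call can't leave a
            // dangling COM object behind; release in reverse order of acquisition.
            if (workbook != null) { workbook.Close(false); Marshal.ReleaseComObject(workbook); workbook = null; }
            if (workbooks != null) { Marshal.ReleaseComObject(workbooks); workbooks = null; }
            if (excel != null)
            {
                excel.Quit();
                Marshal.ReleaseComObject(excel);
                excel = null;
                GC.Collect();                    // make .NET drop any remaining RCWs now
                GC.WaitForPendingFinalizers();
            }
        }
    }
}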
Edit: John Saunders may have a point regarding the license agreement - I can't advise about that. The projects that I did using Word/Excel were all intranet applications at large clients and the requirement to use Word/Excel was a given.
The link he provided also has some tools that may be useful, although those libraries won't have full Excel functionality and if that's what you need, you don't have a choice. If you don't need full Excel functionality, check out those libraries - they may be much simpler to use.
A few links that may be useful to people trying to implement this approach:
StackOverflow question
Possible alternate products
COM server activation and window stations
The story changed a little while ago, with HPC Services for Excel.
With that, you can do Office Automation on a web server. I'm still trying to determine how it fits my situation, but you may want to check it out.
I have a .jpg file which represents the current image from a webcam. Users will be downloading this file at an interval of once a second. Because there could be dozens of users reading it, this could be dozens of reads a second (which is normal for any web server).
The problem is, this image is also updated once a second by a 3rd-party application which "spiders" my local network's webcam portal image. This is so we can build our webcams into our current administration panel.
The problem I am already finding is that ASP.NET sometimes gets an error saying it cannot access the file because the bot has it open for writing. Likewise, the bot sometimes cannot access it because IIS is feeding it to a user.
The bot uses IO.StreamWriter to save the data to the file, and my script uses Response.WriteFile to send the file to the user. (I need to use an actual ASP.NET page with a JPG content type that feeds the file, to make sure only users with an active session can view the JPG.)
My question is: what are the best practices for this? I know why it's happening, but what is the best resolution? Would storing the image as a BLOB in a database be smarter, since databases are designed for concurrent reading and writing? Is there an easier way of doing this with a file that I have not thought of yet?
Thanks in advance,
Anthony Greco
Using a BLOB will work if the readers use SNAPSHOT isolation model (SQL Server 2005 and up). See Download and Upload images from SQL Server via ASP.Net MVC for how to stream an image from a BLOB, and see Understanding Row Versioning-Based Isolation Levels for a lecture on SNAPSHOT.
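As a rough illustration of the BLOB route, a reader under SNAPSHOT isolation might look like the sketch below; the table and column names are invented, and the database must have ALLOW_SNAPSHOT_ISOLATION turned on.

using System.Data;
using System.Data.SqlClient;

public static class WebcamBlobReader
{
    public static byte[] ReadLatest(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            // Snapshot isolation: the read sees the last committed version and
            // never blocks on the writer that is updating the image row.
            using (var tx = conn.BeginTransaction(IsolationLevel.Snapshot))
            using (var cmd = new SqlCommand(
                "SELECT TOP 1 ImageData FROM WebcamImages ORDER BY UpdatedAt DESC", conn, tx))
            {
                return (byte[])cmd.ExecuteScalar();
            }
        }
    }
}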
But using a BLOB may be overkill; you could get away with something much simpler. For instance, if you only have one ASP.NET process, then you could have a global volatile variable for the current file name. The writer writes the JPG into a new file and then updates the global 'current' file name with an Interlocked.CompareExchange operation (it has to be CompareExchange because a newer writer might actually finish faster, outrun a previous writer, and you want to preserve the latest update). There are still some issues left to solve (finding out the file name at startup, cleaning up old files, etc.), but they are all fairly easy to solve.
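A minimal sketch of that single-process idea, assuming the part that fetches the frame runs inside the same ASP.NET process; the class and member names are invented.

using System;
using System.IO;
using System.Threading;

public static class WebcamFrame
{
    // The file name readers should serve via Response.WriteFile.
    private static string _currentFile;

    public static string CurrentFile
    {
        get { return Volatile.Read(ref _currentFile); }
    }

    public static void PublishFrame(byte[] jpegBytes, string folder)
    {
        // Remember which file we believed was current before we started writing.
        string previous = Volatile.Read(ref _currentFile);

        string newFile = Path.Combine(folder,
            "webcam-" + DateTime.UtcNow.ToString("yyyyMMddHHmmssfff") + ".jpg");
        File.WriteAllBytes(newFile, jpegBytes);

        // Publish only if no newer writer has beaten us to it; if the exchange fails,
        // a later frame is already current, so discard ours to preserve the latest update.
        if (!ReferenceEquals(
                Interlocked.CompareExchange(ref _currentFile, newFile, previous), previous))
        {
            File.Delete(newFile);
        }
    }
}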
If you have a farm of servers, or multiple ASP.NET processes serving the site, then things could get complicated. I would still use a rotating file name and a trial-and-error approach (try to respond with the newest file, fall back to the previous older one if a conflict is detected).
You could get the bot to write the data to a different filename and then do a delete and rename to the filename being served by ASP.Net. This should reduce the file lock time down to the time for a delete and rename to occur. To clarify:
ASP.Net serving image from "webcam.jpg"
bot writes image data to "temp.jpg"
when last image byte written, bot deletes "webcam.jpg" and renames "temp.jpg" to "webcam.jpg"
ASP.NET should check that "webcam.jpg" exists; if not, wait 10 ms (or a suitably small increment) and check again.
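To make the steps concrete, here is a hedged sketch of both sides; paths, retry counts, and method names are illustrative only.

using System.IO;
using System.Threading;
using System.Web;

public static class WebcamHandOff
{
    // Bot side: write the new frame to temp.jpg, then swap it in (the delete-and-rename step above).
    public static void SwapIn(string tempPath, string livePath)
    {
        if (File.Exists(livePath)) File.Delete(livePath);
        File.Move(tempPath, livePath);
    }

    // ASP.NET side: retry briefly while the swap is in progress (the check-and-wait step above).
    public static void WriteTo(HttpResponse response, string livePath)
    {
        for (int attempt = 0; attempt < 20; attempt++)   // roughly 200 ms worst case
        {
            try
            {
                if (File.Exists(livePath))
                {
                    response.ContentType = "image/jpeg";
                    response.WriteFile(livePath);
                    return;
                }
            }
            catch (IOException)
            {
                // The file is mid-swap or still locked; fall through and retry.
            }
            Thread.Sleep(10);
        }
        response.StatusCode = 503;   // give up; the client polls again in a second anyway
    }
}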
I am developing a component to provide bespoke bulk-import functionality in ASP.NET. Under the hood, this component will be using the SqlBulkCopy class. There will be different file formats. The file is imported into an intermediate table and is then transformed into the required tables. The uploaded file can be big and might take a couple of minutes to process. I would like to use a Thread or the ThreadPool to do asynchronous processing. Can you please suggest a good approach to handle this problem?
Note: this is an internal application which would be used by at most 2-5 people at any given time.
The main problem with firing up additional threads in ASP.NET is that the framework can rip the AppDomain out from under you (for example, if someone edits the web.config or IIS decides to recycle the worker process). If that happens, your worker thread is also terminated and you can't really control it.
If you don't think that'll be a problem, then it doesn't really matter, but I would suggest that a better solution would probably be to fire up the work in a separate process that you can then monitor from your web application.
That way, if someone edits the web.config or IIS recycles the worker process, the import process is running independently and you don't have to worry.
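A hedged sketch of the separate-process idea: the worker executable name and argument format are hypothetical, and monitoring here is reduced to remembering the process id.

using System.Diagnostics;

public static class ImportLauncher
{
    public static int StartImport(string uploadedFilePath)
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = @"C:\Tools\BulkImportWorker.exe",   // hypothetical worker exe
            Arguments = "\"" + uploadedFilePath + "\"",
            UseShellExecute = false,
            CreateNoWindow = true
        };

        using (Process worker = Process.Start(startInfo))
        {
            // Hand back the PID so the web app can check on it later (or, better,
            // have the worker record its own progress somewhere durable).
            return worker.Id;
        }
    }
}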
Here is my approach:
Ask the user to paste in the UNC path to the file. Save this path into a table in SQL.
Write a Windows service to check for new entries in the path table. When it finds a new entry, it starts processing the file, updates the table periodically with its progress, and checks the flags (below).
Have an AJAX callback in the browser that checks the table for progress, returning it as a percentage to the client. Allow the client to stop the process by adding some flags to the table.
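A rough sketch of the service-side polling loop from the steps above; the table and column names (ImportQueue, Id, FilePath, Status, Progress) are invented for illustration.

using System.Data.SqlClient;

class ImportPoller
{
    private readonly string _connectionString;

    public ImportPoller(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Called from the Windows service on a timer, e.g. every few seconds.
    public void PollOnce()
    {
        using (var conn = new SqlConnection(_connectionString))
        {
            conn.Open();

            int id;
            string path;
            using (var pick = new SqlCommand(
                "SELECT TOP 1 Id, FilePath FROM ImportQueue WHERE Status = 'Queued'", conn))
            using (var reader = pick.ExecuteReader())
            {
                if (!reader.Read()) return;        // nothing queued this cycle
                id = reader.GetInt32(0);
                path = reader.GetString(1);
            }

            ProcessFile(path);   // this is where the SqlBulkCopy work happens

            using (var done = new SqlCommand(
                "UPDATE ImportQueue SET Status = 'Done', Progress = 100 WHERE Id = @id", conn))
            {
                done.Parameters.AddWithValue("@id", id);
                done.ExecuteNonQuery();
            }
        }
    }

    private void ProcessFile(string path)
    {
        // ... SqlBulkCopy from the file at 'path' into the staging table, then transform,
        //     updating Progress periodically and honouring a cancel flag set by the AJAX side ...
    }
}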
I have an ASP.NET project which is a front-end to a database. In addition to the large tables, the DB contains a few small tables to help normalize the larger tables with common values. I have a VB.NET project which loads the smaller tables into memory, using "Shared" (i.e., "static" in C#) member variables, and uses them. I have a call to load the tables in Global.asax - Application_Start. This works for a while. That is, Application_Start runs when I first run my project, loads the cached values, and will correctly keep them in memory for a while.
What I'm seeing (when running my project via Visual Studio 2008 Debugger, hosted locally) is:
A) The Application_Start code will run more than once. Not in a row, but after the user has navigated to some other pages, I'll see (my breakpoint in) another call to initialize the cache, coming from Application_Start. Is this expected?
B) The "Shared" variable that was set to True when the cache was initialized is now False again (which should only happen when the class is first loaded). Similarly, all the data that was chached is no longer present. That is, it looks like VB is unloading all the Shared members. Is this expected?
If these are the expected behaviors, is there a way to do what I want? The code is in a module that is also used by other (non-ASP.NET) projects, and seems to work correctly for them. I'd rather not have to duplicate this functionality for something specific to ASP.NET, but would like to know what my options are. Thanks for any advice.
Here is an article you might find helpful about Caching Data at Application Startup. It sounds like you are doing everything right, but Application_Start should only get called once, unless some external change restarts the app pool; in that case, I would think you would get detached from the debugger (assuming you are attached to the app pool process for your ASP.NET application).
I'm being asked to look into a problem that occurs intermittently on a WebServer running my team's application.
Essentially, we have a web service that does a lookup between codes. If you have Code Type A, you can use it to look up the corresponding Code Type B. Periodically, when memory is running low and this web service is called, a null reference exception is thrown. Essentially, this service loads a lookup file into cache with a dependency on the file, so if the file changes, the cache is reloaded with the new file. The priority on the cache object is set to default. I'm guessing that somewhere in the code it isn't being verified that the cache object is still there, and when memory on the server gets low, that object is dumped, causing the error. I'd like to be able to recreate the error and verify this before I start digging into the code.
Is there a way in IIS Manager (or from the command prompt) to force a running web app to dump its cache? I would think that this should recreate the condition and therefore recreate the bug. Not to mention, seeing the detailed error should lead to the right section of code.
Thanks,
Steve Brouillard
My gut reaction would be to set the WebMethod's CacheDuration to zero, then back to whatever you want on an ongoing basis. I haven't tried this, but I think this would dump the cache then start it forming again...
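In code, that suggestion is just the CacheDuration value on the WebMethod attribute; the service and method names below are invented for illustration.

using System.Web.Services;

public class CodeLookupService : WebService
{
    [WebMethod(CacheDuration = 0)]   // was e.g. CacheDuration = 3600; zero disables response caching
    public string LookupCodeB(string codeA)
    {
        // ... existing lookup against the cached file goes here ...
        return null;
    }
}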
I found a utility that can be added to ASP.NET apps that will allow you to dynamically manage the cache as a whole or individual cache objects. Thanks to .NET Rocks! and dnrtv.
Here's a link to the tool that I used: the ASP Alliance Cache Manager. This allowed me to clear just the specific objects in question, on the fly, and prove the error.
Thanks to everyone for your help.
Steve