How do I update a shared file on IIS and ASP.NET?

We're replacing an "old" web application (the original CodeCentral was written over 10 years ago, now running at http://cc.embarcadero.com with over 12.7 million served) that uses a file-based cache for serving downloads. The web application manages this file cache.
The only complicated bit of serving file-based downloads is updating a file that might be in use by another request. Back when I wrote the original Delphi CGI version of CodeCentral, I vaguely recall being able to rename the existing file even while someone was downloading it, write the new file under the correct name, and eventually clean up the old file.
However, I may be delusional, because that's not working with our current IIS 6 and ASP.NET 2.x or 3.x code. So, what we've done instead is this, which is quite kludgy but eventually works:
public static void FileCreate(Stream AStream, string AFileName, int AWaitInterval)
{
    // Retry for up to AWaitInterval seconds, because the target file may
    // still be locked by an in-progress download.
    DateTime until = DateTime.Now.AddSeconds(AWaitInterval);
    bool written = false;
    while (!written && (until > DateTime.Now))
    {
        try
        {
            using (FileStream fs = new FileStream(AFileName, FileMode.Create))
            {
                CopyStream(AStream, fs);
                written = true;
            }
        }
        catch (IOException)
        {
            // File is locked; wait half a second and try again.
            Thread.Sleep(500);
        }
    }
}
I'm thinking we're missing something simple and there's got to be a better way to replace a file currently being downloaded via IIS and ASP.NET. Is there?

I think we're going to resolve this with the following logic:
We store the date/time a file is updated. If we're unable to update the data-driven file name "[id].zip", we'll create "[id]_[modified].zip" instead, and use that until we can delete the older version and rename the new file to "[id].zip".
That way we're always assured of serving the latest file, by looking for "[id]_[modified].zip" before "[id].zip", and we can easily run a clean-up routine, both on demand and in the background, even if several modified versions accumulate before we can finally replace the "standard" "[id].zip" version.
Of course, this only works if you store the last time the file was modified, but any system using this type of technique should anyway.
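For illustration, here's a minimal sketch of the lookup side of that scheme (the directory layout, timestamp format, and method name are assumptions, not the actual CodeCentral code):

// Prefer "[id]_[modified].zip" when a pending replacement exists, else "[id].zip".
public static string ResolveDownloadPath(string cacheDir, int id, DateTime modifiedUtc)
{
    string pending = Path.Combine(cacheDir,
        string.Format("{0}_{1:yyyyMMddHHmmss}.zip", id, modifiedUtc));
    string standard = Path.Combine(cacheDir, id + ".zip");
    // The pending file is newer by construction, so it wins if present.
    return File.Exists(pending) ? pending : standard;
}

A background or on-demand sweep can then retry FileCreate on "[id].zip" and delete any leftover modified versions once the replacement succeeds.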

John, could you shed a bit more light on the situation?
If the files are small enough you could load the entire file into memory then send it to the client.
Maybe this blog post about downloading and deleting temporary files will help a little.
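For what it's worth, here's a minimal sketch of the load-into-memory suggestion (the header values are illustrative):

// Read the whole file into memory, then stream it to the client. The on-disk
// copy can be replaced as soon as ReadAllBytes returns, since no handle stays open.
byte[] bytes = File.ReadAllBytes(path);
Response.ContentType = "application/zip";
Response.AddHeader("Content-Disposition", "attachment; filename=" + fileName);
Response.BinaryWrite(bytes);
Response.End();

This only scales while files stay small enough that concurrent downloads won't exhaust memory.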

Related

Handling development time web.config conflicts

I am looking for a way to handle this challenge: we are a geographically dispersed dev team using ASP.NET Web API and Angular to build a web app.
What causes the grief is that not all team members use the same database setup for their dev work. Yes, I know - I can use web.config transforms to set the proper connection strings for test, staging and production (and I'm already doing this) - but this is not what I'm talking about.
Due to reasons beyond our control at this time, we have
some developers working on a local SQL Server instance using server=(local);database=OurDB as their connection string
other developers using a central developer SQL Server in their location, using something like server=someserver.mycorp.com;database=OurDB
and a few exotic cases with yet other settings
Now every time someone commits a change to the Git repo and happens to also change something in the web.config, his connection string is committed to the repo. So when I then pull that latest commit, my settings for my local DB server are overwritten by the other guy's settings.
I am looking for a way to handle this - I was hoping I might be able to
hook into the Git pull process and automagically update the web.config connection string to my local needs whenever I pull something
somehow reference a connection string (or external config file) based on e.g. my currently logged in user's name or something like that
But I can't seem to find any way of doing this. I was wondering if I need to build a VS extension to handle this - any starters for that? Has anyone done something like this before and could share his code? (or has it up on Github)
The web.config configuration system used in ASP.NET is not flexible enough to support the more advanced scenario you have described. So, why use it? You could store the configuration in files within the repository, one per developer. Or they could be stored outside the repository or otherwise ignored.
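For what it's worth, one classic-config feature that supports the per-developer-file approach is the configSource attribute, which redirects an entire section to a separate file that can be git-ignored (the file name below is an example, not a convention):

<!-- web.config (committed): redirect the connectionStrings section to a per-developer file listed in .gitignore -->
<connectionStrings configSource="ConnectionStrings.config" />

<!-- ConnectionStrings.config (git-ignored; each developer keeps a local copy) -->
<connectionStrings>
  <add name="OurDB" connectionString="server=(local);database=OurDB" />
</connectionStrings>

The repository can carry a template copy that each developer clones once; note that the redirected file must exist at runtime, or reading the section will throw.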
The real trick is that most older applications don't have a single place that retrieves the configuration, so you have to refactor your application to use a flexible configuration system. For your staging/production environments you probably still want to use the config in web.config. The following code can give you a basic idea of one way to structure it:
using System;
using System.Configuration;

public class MyApplicationConfiguration
{
    public string MainConnectionString { get; set; }
}

public class ConfigurationRetriever
{
    public MyApplicationConfiguration GetConfiguration()
    {
        // You might look for the absence or presence of an environment variable to determine this.
        bool isLocalDevelopment = IsApplicationLocalDevelopment();
        var config = new MyApplicationConfiguration();
        if (isLocalDevelopment)
        {
            config.MainConnectionString = Environment.GetEnvironmentVariable("MyApplication_MainConnectionString");
            // ...or get it from a JSON file, an XML file, or a config database.
        }
        else
        {
            config.MainConnectionString = ConfigurationManager.ConnectionStrings["MainConnectionString"].ConnectionString;
        }
        return config;
    }
}
Rather than rolling your own config-building logic, you might refactor your application to leverage Microsoft.Extensions.Configuration. It's not just for .NET Core; it targets .NET Standard, so you can use it even in your legacy ASP.NET applications. For reading the web.config, you could probably use Microsoft.Extensions.Configuration.Xml. Or you can write your own adapter that pulls values out of ConfigurationManager. I did a basic test, and this worked as expected.
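As a minimal sketch of such an adapter (the package set and naming are assumptions, not a prescribed pattern): copy the classic web.config connection strings into the new configuration system, then let environment variables override them per developer.

using System.Collections.Generic;
using System.Configuration;                 // classic ConfigurationManager
using Microsoft.Extensions.Configuration;   // plus the .Memory and .EnvironmentVariables packages

public static class LegacyConfigAdapter
{
    public static IConfiguration Build()
    {
        // Flatten web.config connection strings into "ConnectionStrings:Name" keys,
        // the convention understood by GetConnectionString().
        var values = new Dictionary<string, string>();
        foreach (ConnectionStringSettings cs in ConfigurationManager.ConnectionStrings)
        {
            values["ConnectionStrings:" + cs.Name] = cs.ConnectionString;
        }

        // Environment variables are added last, so a developer-specific
        // variable wins over the committed web.config value.
        return new ConfigurationBuilder()
            .AddInMemoryCollection(values)
            .AddEnvironmentVariables()
            .Build();
    }
}

With that in place, GetConfiguration() above could simply read config.GetConnectionString("MainConnectionString") instead of branching on the environment.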

Enterprise Library not logging in setup project

I need your opinion on this: is it possible to use the Enterprise Library logging DLL in a setup project?
Here's what I did:
I created a setup project that calls a Windows Form to install the database. When I ran the installer, it did call the Windows Form. However, when I clicked the "Install" button, something went wrong, and I don't know where. A popup message was displayed saying that it cannot locate the logging configuration.
But the config file for the Windows Form is there, and it includes the configuration for the logging DLL. I have no idea where to look.
Please help me with this?
UPDATE
I observed that when I run the exe file directly, the Enterprise Library logging config works. But when it's launched from the setup project, the config isn't found. Any help on this?
Below is the code for this:
using System.Collections;
using System.ComponentModel;
using System.Configuration.Install;
using System.Windows.Forms;

[RunInstaller(true)]
public partial class IPWInstaller : Installer
{
    public IPWInstaller()
    {
        InitializeComponent();
    }

    public override void Install(IDictionary stateSaver)
    {
        base.Install(stateSaver);

        // TargetDir is passed in from the setup project's custom action data.
        string targetPath = Context.Parameters["TargetDir"];

        InstallDatabase db = new InstallDatabase(targetPath);
        DialogResult dbResult = db.ShowDialog();
        if (dbResult != DialogResult.OK)
        {
            throw new InstallException("Database is not installed.");
        }

        ConfigureFiles config = new ConfigureFiles(targetPath);
        DialogResult configResult = config.ShowDialog();
        if (configResult != DialogResult.OK)
        {
            throw new InstallException("Config files are not saved correctly.");
        }
    }
}
LATEST UPDATE:
I tried displaying the value of a particular configuration setting in a message box when running the install project.
Is there a way to load my app.config in the setup project?
There are at least a couple of things that can go wrong.
The app is not running as it would if you ran it as an interactive user. It is being called from an msiexec.exe process that knows nothing about your intended environment, such as working directory. None of the automatic things that happen because you run from an explorer shell will happen. Any paths you use need to be full and explicit. I think you may need to explicitly load your settings file.
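For example, a minimal sketch of loading the installed app's configuration by its full path from inside the custom action (the exe name here is hypothetical):

using System.Configuration;  // System.Configuration.dll
using System.IO;

string targetPath = Context.Parameters["TargetDir"];
string exePath = Path.Combine(targetPath, "IPW.exe");  // hypothetical exe name
Configuration cfg = ConfigurationManager.OpenExeConfiguration(exePath);
// Enterprise Library can then be pointed at cfg.FilePath explicitly (for
// example via a FileConfigurationSource) instead of relying on the default
// lookup, which resolves relative to msiexec.exe.

Whether the logging block picks this up depends on how it is bootstrapped, so treat this as a direction to investigate rather than a drop-in fix.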
Something else that can happen in a per-machine install is that custom actions run under the system account, so any code that assumes access to databases or user-profile items (such as folders) can fail.
Another problem is that Windows Forms often don't work well when called from a Visual Studio custom action; that environment doesn't provide the STA threading model required for window messages and the like.
In general it's better to run these config programs after the install, the first time the app starts: you're then in a normal environment, debugging and testing are straightforward, and if the database gets lost the user can run the program again to recreate it instead of uninstalling and reinstalling the setup.
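A hedged sketch of that first-run approach (all names are hypothetical except InstallDatabase, which is borrowed from the code above):

using System;
using System.IO;
using System.Windows.Forms;

static class Program
{
    [STAThread]
    static void Main()
    {
        Application.EnableVisualStyles();

        // Marker file records that the database setup already ran.
        string marker = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
            "IPW", "db.installed");

        if (!File.Exists(marker))
        {
            // Normal interactive environment: full user profile, STA thread,
            // and easy debugging - none of which the msiexec custom action has.
            using (var setup = new InstallDatabase(Application.StartupPath))
            {
                if (setup.ShowDialog() != DialogResult.OK)
                    return; // user cancelled; don't start the app
            }
            Directory.CreateDirectory(Path.GetDirectoryName(marker));
            File.WriteAllText(marker, DateTime.UtcNow.ToString("o"));
        }

        Application.Run(new MainForm()); // hypothetical main form
    }
}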

Uploading big Files

I am an amateur ASP.NET developer working on my first job (a friend's website). ASP.NET v4.0, using VS2010.
His company makes 3D models (using a 3D printer). The website is currently in development but can be found here. I would be the first to admit that the code has been a bit rushed and hacked in, but my friend is quite happy with what he has so far.
One of the requirements is that his customers need to be able to upload their model design files, which could be up to 100 MB each (or maybe more). I am struggling to get this to work properly.
I started by using the built in <asp:FileUpload ID="FileUpload1" runat="server" /> tag and an animated gif image, similar to the idea described in Joe Stagner's tutorial - thanks Joe, I like your presentations a lot.
This worked OK for a small test file but doesn't give any indication of upload progress. So I tried to improve my solution using ideas developed from Sunasara Imdadhusen in his Code Project article. My upload code looks like this:
Task t = Task.Factory.StartNew(() =>
{
    byte[] buffer = new byte[UPLOAD_BUFFER_BYTE_SIZE];
    // Upload the file in chunks so that we can measure how long it is taking.
    using (FileStream fs = new FileStream(Path.Combine(newQuotePath, filename), FileMode.Create))
    {
        DateTime stopwatch = DateTime.Now;
        while (stats.Uploaded < stats.TotalSize)
        {
            int bytecount = postedFile.InputStream.Read(buffer, 0, UPLOAD_BUFFER_BYTE_SIZE);
            if (bytecount == 0)
                break; // end of stream; avoids spinning if TotalSize is never reached
            fs.Write(buffer, 0, bytecount);
            stats.Uploaded += bytecount;
            // Use the bytes actually read, not the buffer size, for the rate.
            double dRate = bytecount / Math.Abs((DateTime.Now - stopwatch).TotalSeconds);
            stats.Rate = (int)(Math.Min(dRate, int.MaxValue));
            // Sleep is for debugging only!
            //System.Threading.Thread.Sleep(2000);
            stopwatch = DateTime.Now;
        }
    }
}, TaskCreationOptions.LongRunning);
Where stats is a reference to an object stored as a session variable; while the upload is taking place, it is polled from a JavaScript function running in a setInterval(...) via this PageMethod:
[System.Web.Services.WebMethod]
[System.Web.Script.Services.ScriptMethod]
public static UploadStatus GetFileUploadStatus()
{
    UploadStatus stats = (UploadStatus)HttpContext.Current.Session["UploadFileStatus"];
    if ((stats != null) && (stats.IsReady))
    {
        return stats;
    }
    else
    {
        return null;
    }
}
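For reference, a sketch of what the UploadStatus class might look like; only the four members used above are grounded in the question, and their types are assumptions:

public class UploadStatus
{
    public long TotalSize { get; set; }  // bytes expected, set before the copy loop starts
    public long Uploaded { get; set; }   // bytes written to disk so far
    public int Rate { get; set; }        // bytes/second, recalculated per chunk
    public bool IsReady { get; set; }    // true once the stats are safe to report
}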
This worked on my local machine (using the thread sleep to slow down the upload). So I published it to our host 123-Reg and it didn't work as expected. The animated gif comes on when the upload starts, but the progress bar doesn't start moving. The file input control and the submit button (in an IFrame) that should get disabled as soon as the upload starts take ages to become disabled. Then the web page just hangs there. After waiting for a while I clicked the page refresh button. When the page refreshed it showed that my test file had been uploaded successfully. I tried a 16 KB file and a 1.6 MB file.
Since this worked on my local machine, I suspect that this is happening because our website host (123-Reg) is using a web farm.
Anyway, I thought this was going to be easy but it is not, so I had a look for some open-source upload progress bars. I looked at NeatUpload, but it says "By default, NeatUpload won't work properly on a web garden or web farm", and that to get it to work on a web farm you should "Specify the same random 32-hex-digit decryptionKey attribute in the [relevant] section of each server's Web.config". But I don't think I have access to 123-Reg's server config files(?).
My friend has suggested that perhaps we could use a service such as Dropbox, but I had a look and I have no idea how to add this to our website. This might be a good option, since such services might be optimized for uploads.
Any advice or suggestions would be very much appreciated - thanks.
EDIT: The story so far ... still struggling with this.
I looked into using Dropbox in detail (and other cloud storage providers such as SkyDrive etc.). It seems these services are designed around providing applications that interact with an individual user's storage. I wanted all users to be able to upload files to MY Dropbox folder, but without sharing (customers should not have access to other customers' design files). Anyway I carried on, set up a Dropbox account, installed the SharpBox SDK and reprogrammed my upload code (the bit inside the task action). This seemed to work OK on my local machine; it was a bit slower because it was uploading to the Dropbox server (I didn't need the Thread.Sleep). I published the website to the 123-Reg server and got a similar, unusable experience as before.
So far I had been testing on IE9, and I just happened to try it out with Chrome and noticed a funny thing. Just after clicking the upload button, but before the page refreshed, Chrome showed an upload % complete dialog in the bottom left. This ran to 100%, then my controls started to show some action before the whole page started to hang again.
According to MSDN the HttpPostedFile:
"By default, all requests, including form fields and uploaded files, larger than 256 KB are buffered to disk"
So does this mean that my Task is being started only after the painful part of waiting for the upload (which is actually happening between the client and a buffer)? If that's the case, then it would not make sense to make it more painful by then sending the file off to Dropbox, right? And my progress bar is tracking the progress of the wrong bit?
(Today I am going to check the error handling of my code since I have a feeling that the reason it is hanging is because it is not recovering from an exception properly). It certainly feels like I am learning a lot by doing this.
This might be a bit overkill, but you could look into using blueimp's jQuery File Upload tool (demo site here: http://blueimp.github.com/jQuery-File-Upload/ ) - a developer called Max Pavlov has modified the original version (originally for non-dotnet technologies) for use in MVC 3.
It can be found on GitHub here: https://github.com/maxpavlov/jQuery-File-Upload.MVC3 . I have successfully implemented this in both MVC 3 and MVC 4 Beta. The only thing I had to do to make it work effectively in MVC 4 was remove some of the ClientDependancy code (a DLL that handles bundling and minification of JS and CSS files), as it replicates functionality already in MVC 4 but not in MVC 3. Additionally, I have added some pages to the wiki over on GitHub describing what I did, although it's not complete. If you decide to go down this route I could update some details there with my more recent findings. My notes so far on MVC 4 integration can be found here: https://github.com/maxpavlov/jQuery-File-Upload.MVC3/wiki/MVC-4---EnableDefaultBundles .
Incidentally, I have managed to upload files of up to about 2 GB using this tool and have found it quite flexible!

EF 4.1 Code First Initialization - Alternative

I am building a web application using Entity Framework with the code-first approach, and I really like it so far except for one thing: the initialization process and seeding data is crap.
I have set it up as recommended for ASP.NET MVC, with Database.SetInitializer called in Application_Start and a custom initializer class to add data, but it always seems to fail silently and never work. (The database creation works; just the data init fails.)
Can anyone provide recommended practice for this, or a way to run a SQL script from a file?
The given method for adding data, especially for a demo site, seems cumbersome. I would prefer to just run a database script directly from a file, once, as part of an install process, rather than depending on a process that fails without any indication that something has gone wrong.
EDIT
I have noticed it throwing exceptions (idiotic datetime -> datetime2 conversion errors that should be handled by Entity Framework).
But part of the problem may be that my copy of Visual Studio 2010 Express is not breaking on errors; it seems to be very buggy when debugging.
But the issue still stands: I find it a cumbersome and buggy way of essentially running SQL scripts on the database, and I don't want to end up with a huge set of methods and classes just to set up a demo site when someone installs my web application in IIS.
If you want to run SQL scripts from your initializer I would recommend adding
using System.Data.SqlClient;
using System.IO;
using System.Web;
using Microsoft.SqlServer.Management.Smo;
using Microsoft.SqlServer.Management.Common;

string scriptDirectory = HttpContext.Current.Server.MapPath("~/SqlScripts");
string sqlConnectionString = context.Database.Connection.ConnectionString;
DirectoryInfo di = new DirectoryInfo(scriptDirectory);
FileInfo[] rgFiles = di.GetFiles("*.sql");
foreach (FileInfo fi in rgFiles)
{
    using (TextReader reader = new StreamReader(fi.FullName))
    using (SqlConnection connection = new SqlConnection(sqlConnectionString))
    {
        // SMO's ServerConnection understands the "GO" batch separators
        // that plain SqlCommand does not.
        Server server = new Server(new ServerConnection(connection));
        server.ConnectionContext.ExecuteNonQuery(reader.ReadToEnd());
    }
}
The reason for using SQL Server Management Objects is that you can use "GO" in your scripts. It then becomes incredibly easy to script from SSMS and paste the scripts into your SqlScripts directory.
You can find the SMO Libraries at:
C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.SqlServer.ConnectionInfo.dll
C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.SqlServer.Management.Sdk.Sfc.dll
C:\Program Files\Microsoft SQL Server\100\SDK\Assemblies\Microsoft.SqlServer.Smo.dll
and if you need help scripting your data
SP_Generate_Inserts
Until you show a reproducible code snippet where initialization fails without throwing an exception, I find it hard to believe that this happens.
You can always execute any SQL script by falling back to classic ADO.NET with SqlConnection and SqlCommand. Just open the file, load the commands into a string, and execute them with SqlCommand or Database.ExecuteSqlCommand.
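For instance, a minimal sketch of that fallback (splitting on "GO" is an assumption about how the scripts are written, since SqlCommand does not understand SSMS batch separators):

using System;
using System.Data.SqlClient;
using System.IO;
using System.Text.RegularExpressions;

public static class SqlScriptRunner
{
    public static void Run(string connectionString, string scriptPath)
    {
        string script = File.ReadAllText(scriptPath);
        // "GO" is an SSMS batch separator, not T-SQL, so split the script on it.
        string[] batches = Regex.Split(script, @"^\s*GO\s*$",
                                       RegexOptions.Multiline | RegexOptions.IgnoreCase);
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            foreach (string batch in batches)
            {
                if (string.IsNullOrWhiteSpace(batch)) continue;
                using (var command = new SqlCommand(batch, connection))
                {
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}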

Can NHibernate check if the db schema has been generated?

So, newbie NHibernate user; trying to wrap my brain around it.
I'm contemplating how to handle deployment, and later injection of add-ons to a web app (which may require their own persistence classes).
I was thinking that using SchemaExport for the deployment would work pretty well, but I was wondering if there's a way to get NHibernate to tell me, in a common, code-based way, whether a schema export has been done already or not. Basically, I want to do something like this pseudocode:
if (!_cfg.HasSchemaForType(typeof(MyType)))
    ExportSchema(typeof(MyType));
else
    UpdateSchema(typeof(MyType));
EDIT: Guys, I appreciate the answers so far, but they're missing the point a bit. What I'm trying to set up is a way for the application to allow for the addition and removal of add-ons which may require changes to the db. I'm not talking about versioning my own code or the like (at least, not as its primary function). So the question is less about when I deploy the app, and more about when I add or remove a plug-in. Has this plugin (hence the pseudo-code type check) been deployed before? If so, run the update. If not, run the export. Make sense?
I think that what you are looking for is SchemaUpdate.Execute instead of using SchemaExport. SchemaUpdate will create the schema if it doesn't already exist, or update it if required and desired.
That works for me using both MSSQL and SQLite.
new SchemaUpdate(config).Execute(false, true);
Yes there is, in 3.0 at least
public static bool ValidateSchema()
{
    var validator = new NHibernate.Tool.hbm2ddl.SchemaValidator(m_cfg);
    try
    {
        // Throws if the mapped schema doesn't match the database.
        validator.Validate();
        return true;
    }
    catch (Exception ex)
    {
        MsgBox(ex.Message, "Schema validation error");
    }
    return false;
}
For the update part, do:
public static void UpdateSchema()
{
    var schema = new NHibernate.Tool.hbm2ddl.SchemaUpdate(m_cfg);
    // Execute(useStdOut: false, doUpdate: true) applies the changes directly.
    schema.Execute(false, true);
}
No, NHibernate doesn't do what you're asking. I imagine it would be possible to write some code that exports the schema and then compares it to the database schema, but it would probably be easier to export into a temporary database and use a third-party tool, such as Red Gate SQL Compare, to compare the schemas.
Even if it did what you're asking, I don't see how that would help with deployment because its purpose is to create a database from scratch.
Edited to add: Assuming each plugin has its own set of tables, you could determine if the schema has been deployed using one of several methods:
Attempt to load one of the plugin objects and catch the exception.
Examine the database schema (using SMO for SQL Server) to check if the table(s) exist - see the sketch after this list.
Create a record in a table when a plugin is deployed.
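A quick sketch of the second option for SQL Server, using plain ADO.NET against INFORMATION_SCHEMA rather than SMO (the table name is a hypothetical plugin table):

using System.Data.SqlClient;

public static bool PluginTablesExist(string connectionString, string tableName)
{
    const string sql =
        "SELECT COUNT(*) FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = @name";
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        command.Parameters.AddWithValue("@name", tableName);
        connection.Open();
        return (int)command.ExecuteScalar() > 0;
    }
}

// e.g.: if (!PluginTablesExist(connStr, "MyPluginRoot")) ExportSchema(typeof(MyType));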
The purpose of schema export is to generate the complete schema from scratch. Really useful if you haven't deployed your application yet.
After the first deployment I would highly recommend using a migrations tool, which will help you with further extensions/modifications of the schema. If you think a bit further ahead, you will notice that you'll even require data manipulation (e.g. removing wrong data that was generated due to a bug) as your application evolves. That's all stuff a migration tool can help you with.
Take a look into:
Migrator.net
Here is a list of more migration tools for .NET, answered in an SO question:
.net migrations engine
The original idea of migrations comes from Ruby on Rails and has been "cloned" into other frameworks over the years. That's why it's definitely good to read about the original idea at http://guides.rubyonrails.org/migrations.html too.
If you have VS Team Suite or the Database Developer edition, it can sync and track changes and then generate a deployment script that will create all the right objects for you. Also, Red Gate has a Schema Compare product that does the same thing, if I'm not mistaken.
