Any instance that can delete web.config file - asp.net

We have a web farm with 2 servers, both with DFS replication implemented. Our homepage has a section that displays 4 articles, known as "TopStories", which are cached so that they are not fetched from the database on every request. When an editor updates the TopStories, the change should go live immediately for end users across the web farm. Say an editor makes the change on server 1, but the end user who visits the website is routed to server 2: that user won't see the latest TopStories, because server 2 still serves them from its cache. To work around this, we recycle the app pool by updating the web.config file.
We are using the code below to update the appSettings element of web.config:
public void RefreshWebConfig()
{
    //XmlDocument doc = new XmlDocument();
    //doc.Load(HttpContext.Current.Server.MapPath("web.config"));
    //doc.Save(HttpContext.Current.Server.MapPath("web.config"));
    System.Configuration.Configuration config = System.Web.Configuration.WebConfigurationManager.OpenWebConfiguration("~");
    System.Configuration.KeyValueConfigurationElement setting = config.AppSettings.Settings["TopStorySessionKey"];
    int sessionKey = 0;
    if (setting != null)
    {
        sessionKey = Convert.ToInt32(CommonUtil.GetConfigurationValue<int>("TopStorySessionKey"));
        if (sessionKey == 100)
            sessionKey = 0;
        config.AppSettings.Settings["TopStorySessionKey"].Value = Convert.ToString(sessionKey + 1);
    }
    else
    {
        config.AppSettings.Settings.Add("TopStorySessionKey", sessionKey.ToString());
    }
    config.Save();
}
I have 2 questions related to this:
Could updating the appSettings key in web.config with the above code, under any circumstance, delete the whole web.config file?
Is there a better way to replicate the change to both servers?
Thanks in advance.

Could updating the appSettings key in web.config with the above code, under any circumstance, delete the whole web.config file?
I can't see how this code as written would delete web.config, but it's a rather indirect way to force an app pool recycle. You can do it directly:
Restarting (Recycling) an Application Pool
Is there a better way to replicate the change to both servers?
Recycling the entire app pool just to invalidate a small cache is rather expensive. Instead, consider using a MemoryCache. If you can place the latest articles in, e.g., a file (whether a single file accessed over a network share, or a copy of the file on each web server), you can use a file dependency to expire the cache. If the articles are in a SQL database, you can use a SqlChangeMonitor to expire the cache entry. If you can tolerate the latest articles being, e.g., 2 minutes old, you can use time-based expiration (absolute rather than sliding, since a frequently read entry would never expire under sliding expiration) to reload the cache every so often.
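As a minimal sketch of the MemoryCache approach above: the cache key and the loadFromDb delegate below are hypothetical stand-ins for your data access code, and the 2-minute absolute expiration is the tolerance mentioned above.

```csharp
using System;
using System.Runtime.Caching;

public static class TopStoryCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public static string[] GetTopStories(Func<string[]> loadFromDb)
    {
        var stories = Cache.Get("TopStories") as string[];
        if (stories == null)
        {
            stories = loadFromDb();
            var policy = new CacheItemPolicy
            {
                // Reload at most every 2 minutes, so an editor's change goes
                // live on every server without recycling any app pool.
                AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(2)
            };
            // To expire as soon as a shared flag file changes instead,
            // add a HostFileChangeMonitor to policy.ChangeMonitors here.
            Cache.Set("TopStories", stories, policy);
        }
        return stories;
    }
}
```

Because each server refreshes independently from the database, no cross-server replication of the invalidation signal is needed at all.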

Related

Can I use ServerManager from Microsoft.Web.Administration without admin user as an application pool identity

I want to read some settings of the application pool using the ServerManager object from Microsoft.Web.Administration.dll. The problem is that it works only if the identity of the application pool is a Windows user with administrator privileges. Otherwise I get an UnauthorizedAccessException: "Filename: redirection.config; Error: Cannot read configuration file due to insufficient permissions."
Is there any workaround for this issue?
My code is the following:
ServerManager manager = new ServerManager();
string currentSiteName = System.Web.Hosting.HostingEnvironment.SiteName;
Site currentSite = manager.Sites[currentSiteName];
string appVirtualPath = HttpRuntime.AppDomainAppVirtualPath;
string appPoolName = string.Empty;
foreach (Application app in currentSite.Applications)
{
    string appPath = app.Path;
    if (appPath == appVirtualPath)
    {
        appPoolName = app.ApplicationPoolName;
    }
}
ApplicationPool currentAppPool = manager.ApplicationPools[appPoolName];
Thanks!
No, there is no workaround: allowing that configuration file to be read from a worker process would be a big security concern. What are you trying to accomplish?
If you are reading configuration settings, you can use an API in the same DLL that gives you read-only access to that site's configuration, such as reading web.config or the values in applicationHost.config that apply to that site only (excluding encrypted ones, such as passwords). The API is WebConfigurationManager, which has a static GetSection method, e.g. WebConfigurationManager.GetSection("system.webServer/defaultDocument").
See: https://msdn.microsoft.com/en-us/library/microsoft.web.administration.webconfigurationmanager.getsection.aspx
However, several settings (namely, all the ones used to start the w3wp.exe process) cannot be read through that API.
Short story: unfortunately, for security reasons, many of those settings cannot be read from a worker process. There are some things you can read using server variables, such as Request.ServerVariables["APP_POOL_ID"] or Request.ServerVariables["APP_POOL_CONFIG"]. For bitness, you could calculate the size of a pointer (4 or 8 bytes), or use environment variables (like PROCESSOR_ARCHITECTURE).
Longer story: in IIS, for security reasons, we take the applicationHost.config file and split it into smaller per-application-pool .config files (by default located at C:\inetpub\temp\appPools), which are isolated so that even if untrusted code were running in the process (w3wp.exe) and tried to steal or read the settings of other sites, it would be physically impossible. You can open the file to see which settings are there, and you can read those. You'll notice the appPools section is missing entirely, since that is only used by WAS to start w3wp.exe.

Slow AppFabric, high CPU and Memory Usage

I implemented AppFabric 1.1 in my ASP.NET web application. I am using the read-through approach because I just need to read images from my SQL database and store them in the cache, so that I can retrieve the data as fast as possible.
Checking the shell, I can see that my application successfully reads from the cache and writes to the cache when the cache is empty. However, AppFabric is not as fast as I expected: the version without AppFabric is faster than the one with it. In addition, when I use AppFabric, I see high CPU and memory usage.
What are the potential reasons for this? What do you suggest?
I'd appreciate your ideas.
Without more details it's hard to tell for sure, but I can try to help from my experience with AppFabric. Are we talking about high memory usage on the AppFabric server, or on the client machine (I'm not sure whether you are using a web app or something else)?
AppFabric will be slower than in-process memory; also, AppFabric should not run on the same server as your application.
How are you creating the AppFabric DataCacheFactory? Are you creating one for every request? That is bad, as it is expensive, so it should be a static/singleton. I do something like:
public class AppFabricDistributedCacheManagerFactory
{
    private static DataCacheFactory _dataCacheFactory;

    public void Initialize()
    {
        if (_dataCacheFactory == null)
        {
            _dataCacheFactory = new DataCacheFactory();
        }
    }
    ......
Do you have local cache enabled in AppFabric? For images it seems appropriate.
Make sure your provider is not throwing exceptions and only calls AppFabric when it really should. Put Fiddler on your dev box and watch the requests. Things to watch for:
On the first call to AppFabric: are you using regions? Make sure you create them.
If you are using regions, do you check that the region exists before you save? I had a few issues before I started using code like this:
public void SaveToProvider(string key, TimeSpan duration, string regionName, object toSave)
{
    try
    {
        Cache.Put(key, toSave, duration, regionName);
    }
    catch (DataCacheException cacheError)
    {
        // Look at the ErrorCode property to see if the Region is missing
        if (cacheError.ErrorCode == DataCacheErrorCode.RegionDoesNotExist)
        {
            // Create the Region and retry the Put call
            Cache.CreateRegion(regionName);
            Cache.Put(key, toSave, duration, regionName);
        }
    }
}
Watch the requests when you request an item that is not in the cache: you should see a call to AppFabric, then the image being loaded, then another call to AppFabric to save it.
Watch the requests when you know the item is already loaded: if you are using local cache you should see no AppFabric requests; you will see one if you are not.

Using ffmpeg in asp.net

I needed an audio conversion library. After pulling my hair out, I have given up: there is no suitable audio library out there; every library has some problem or another.
The only option left is ffmpeg, which is the best, but unfortunately you cannot use it in ASP.NET (not directly, I mean). Every user on the website who converts a file will launch an exe? I think I will hit the server's memory ceiling soon.
Bottom line: I will try using ffmpeg.exe and see how many users it can support simultaneously.
I went to the ffmpeg website, and in the Windows download section I found 3 different versions: static, shared, and dev.
Does anyone know which would be best for use from ASP.NET? Everything packed into one exe (static), or separate DLLs with a small exe (shared)?
PS: if anyone knows a good library out there, it would be great if you could share.
Static builds provide one self-contained .exe file for each program (ffmpeg, ffprobe, ffplay).
Shared builds provide each library as a separate .dll file (avcodec, avdevice, avfilter, etc.), plus .exe files for each program that depend on those libraries.
Dev packages provide the headers and .lib/.dll.a files required to use the .dll files in other programs.
ffmpeg is the best library out there from what I have used, but I wouldn't recommend trying to call it directly from ASP.NET.
What I have done is accept the upload, store it on the server (or S3 in my case), then have a worker role (if using something like Azure) with a process that continuously monitors for new files to convert.
If you need a realtime-like solution, you could update flags in your database and use AJAX to poll the database for progress updates, then show a download link once the conversion is complete.
Personally my approach would be
Azure Web Roles
Azure Worker Role
ServiceBus
The WorkerRole starts up and monitors the ServiceBus queue for messages.
The ASP.NET site uploads and stores the file in S3 or Azure storage.
The ASP.NET site then records information in your DB if needed and sends a message to the ServiceBus queue.
The WorkerRole picks this up and converts.
AJAX will be needed on the ASP.NET site if you want a realtime monitoring solution. Otherwise you could send an email when complete if needed.
Using a queuing process also helps you with load as when you are under heavy load people just wait a little longer and it doesn't grind everything to a halt. Also you can scale out your worker roles as needed to balance loads, should it ever become too much for one server.
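The queue pattern above can be sketched in-process: this is a minimal illustration using a BlockingCollection as a stand-in for the ServiceBus queue (the class name and the convert delegate are hypothetical; on a real worker role, convert would shell out to ffmpeg).

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

public static class ConversionQueue
{
    public static string[] Drain(string[] uploadedFiles, Action<string> convert)
    {
        var queue = new BlockingCollection<string>();
        foreach (var file in uploadedFiles)
            queue.Add(file);          // the web role enqueues a message per upload
        queue.CompleteAdding();

        var done = new ConcurrentBag<string>();
        // The "worker role": consumes messages until the queue is drained,
        // so under heavy load uploads just wait a little longer in the queue.
        Task.Run(() =>
        {
            foreach (var file in queue.GetConsumingEnumerable())
            {
                convert(file);
                done.Add(file);
            }
        }).Wait();
        return done.OrderBy(f => f).ToArray();
    }
}
```

The key design point is that the web role returns to the user as soon as the message is enqueued; conversion capacity scales by adding consumers, not web servers.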
Here is how I run ffmpeg from C# (you will need to change the parameters for your requirements):
// "params" is a reserved word in C#, so use a different variable name
string arguments = string.Format("-i {0} -s 640x360 {1}", input.Path, "C:\\FilePath\\file.mp4");
RunProcess(arguments);
private string RunProcess(string parameters)
{
    // configure the process so we can capture ffmpeg's output
    ProcessStartInfo oInfo = new ProcessStartInfo(this._ffExe, parameters)
    {
        UseShellExecute = false,
        CreateNoWindow = true,
        RedirectStandardOutput = true,
        RedirectStandardError = true
    };
    // collect both streams; note ffmpeg writes its progress to standard error
    var output = new StringBuilder();
    try
    {
        using (Process proc = Process.Start(oInfo))
        {
            proc.OutputDataReceived += (s, e) => { if (e.Data != null) output.AppendLine(e.Data); };
            proc.ErrorDataReceived += (s, e) => { if (e.Data != null) output.AppendLine(e.Data); };
            proc.BeginOutputReadLine();
            proc.BeginErrorReadLine();
            proc.WaitForExit();
        }
    }
    catch (Exception)
    {
        // Capture/log the error as appropriate
    }
    return output.ToString();
}

Problems with Session in Asp.Net

We have a web farm of IIS 6 servers that runs our application.
Our session state is stored in SQL Server 2005 on a different server.
Every couple of months we get this error in one of the web server logs:
"Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached."
Stack trace:
System.Data.ProviderBase.DbConnectionInternal.GetConnection(System.Data.Common.DbConnection)
at System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection)
at System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)
at System.Data.SqlClient.SqlConnection.Open()
at System.Web.SessionState.SqlSessionStateStore.SqlStateConnection..ctor(SqlPartitionInfo sqlPartitionInfo)
When this exception is thrown, the server starts to behave strangely: some users can access the app and some cannot.
The only solution we have found so far is to reset IIS on that server.
I should also mention that the server does not appear to be overloaded, and performance is quite normal before this happens.
Any ideas?
This is a classic case of bad resource management.
If you are using a custom session manager (module) with SQL, then you may not be disposing of the connections properly, and the connection pool is running out of connections. All subsequent connections then wait for a connection to be disposed automatically, and this is where the timeout occurs.
However, this is probably not your problem, so what you need to do is increase the session-state access timeout, as described here:
Timeouts under heavy load
If your web servers are under heavy load, it may be useful to increase the timeout for session state access. You can add the stateNetworkTimeout attribute to the sessionState settings in web.config and machine.config.
If a web server or a state server is under stress and cannot complete session accesses on time, event ID 1072 and event ID 1076 may be logged in the event log.
http://idunno.org/articles/277.aspx
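As a hedged illustration of the attribute mentioned in the quote above, assuming SQL Server session state and a placeholder server name (stateNetworkTimeout is in seconds; the default is 10):

```xml
<configuration>
  <system.web>
    <sessionState mode="SQLServer"
                  sqlConnectionString="data source=YourSessionDbServer;Integrated Security=SSPI"
                  stateNetworkTimeout="30" />
  </system.web>
</configuration>
```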
Furthermore
You should only use session state for basic data types like string, int, and bool.
If you are storing a lot of information or complex data types, maybe you need to reassess what is stored there and why.
You should look into using Cache or Viewstate. There are many such articles on the internet, for example:
http://www.codeproject.com/KB/session/exploresessionandcache.aspx
Since your session state is SQL-based, which is the slowest mode, you should really use it as little as possible. Maybe you could store values in the cache with a unique key, and store only the unique key in the session variable. Many workarounds exist.
Another more useful link:
http://devshop.wordpress.com/2008/04/10/how-to-choose-from-viewstate-sessionstate-cookies-and-cache/
As your comments became more specific, I have the following to add. If you create a class like the following:
public class PartitionResolver : System.Web.IPartitionResolver
{
    private String[] partitions;

    public void Initialize()
    {
        // create the partition connection string table
        // web1, web2
        partitions = new String[] { "192.168.1.1", "192.168.1.2" }; // keep adding servers
    }

    public String ResolvePartition(Object key)
    {
        String oHost = System.Web.HttpContext.Current.Request.Url.Host.ToLower().Trim();
        if (oHost.StartsWith("10.0.0") || oHost.Equals("localhost"))
            return "tcpip=127.0.0.1:42424";

        String sid = (String)key;
        // hash the incoming session ID into
        // one of the available partitions
        Int32 partitionID = Math.Abs(sid.GetHashCode()) % partitions.Length;
        return ("tcpip=" + partitions[partitionID] + ":42424");
    }
}
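To see why this balances sessions, the core of ResolvePartition above is just a stable hash-modulo. A small standalone sketch (the class and method names are illustrative, not part of the resolver API):

```csharp
using System;

public static class PartitionDemo
{
    // The same session ID always maps to the same state server, and
    // different IDs spread across all available servers.
    public static int PickPartition(string sessionId, int partitionCount)
    {
        // Math.Abs(int.MinValue) would throw, so mask off the sign bit instead.
        return (sessionId.GetHashCode() & 0x7FFFFFFF) % partitionCount;
    }
}
```

Note that string.GetHashCode is only stable within a single process on modern .NET, which is fine here because each web server resolves partitions for its own requests against the same fixed server list.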
... and then in your web.config you put something like the following:
<sessionState mode="StateServer"
partitionResolverType="NameSpaceName.PartitionResolver"
cookieless="false"
timeout="60" />
... and then follow the instructions:
http://www.c-sharpcorner.com/UploadFile/gopenath/Page107182007032219AM/Page1.aspx
... and create an identical machineKey across all your web servers, then you will not require SQL session state, and you will have a common session state that you can load-balance across any number of state servers.
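For reference, the machineKey element in web.config looks like the following; the key values are deliberately left as placeholders here, since each farm must generate its own keys (identical on every web server):

```xml
<machineKey validationKey="..."
            decryptionKey="..."
            validation="SHA1" />
```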
All you would ever have to do is update this line:
partitions = new String[] { "192.168.1.1", "192.168.1.2" }; // keep adding servers
... and you can have multiple web servers using the same state servers, so even if you switch web servers for whatever reason, you will still maintain your session. And as you see session access slowing down, just keep adding state servers.

File.Create from IIS locking the created File

I have an ASP.NET application running in IIS 7.5 that creates files on the local file system and then attempts to delete them after performing some logic between creation and deletion. I'm running into a situation where deletion fails with a message like "The process cannot access the file 'C:\...\Uploads\c1fe593f-85de-4de1-b5d1-7239e1fc0648_Tulips.jpg' because it is being used by another process." The file appears to be locked by IIS and I can't delete it. Here's an example of the code for creating and deleting:
// File.WriteAllBytes(path, rawData); // this seems to leave the file open!
using (var file = File.Create(path))
{
    file.Write(rawData, 0, rawData.Length);
    file.Close(); // should close when it goes out of scope, but just to be safe
}
Is there some special option I need to pass to File.Create? How do I get around this?
File.WriteAllBytes(path, rawData); should work fine, assuming the path parameter you are passing is unique and you don't have concurrent requests where one writes while another tries to read the same file. To guard against that, you could use a ReaderWriterLockSlim to synchronize the access if the situation could potentially occur. Also make sure that no other part of the code leaks the file handle.
Take a look at SysInternals Process Explorer, which can show you exactly where this file handle is being held.
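As a rough sketch of the ReaderWriterLockSlim suggestion above (the class and method names are illustrative, not from the question's code): serializing writes and deletes through one lock ensures a delete never races a concurrent write, and File.WriteAllBytes closes its handle itself.

```csharp
using System;
using System.IO;
using System.Threading;

public static class UploadStore
{
    private static readonly ReaderWriterLockSlim Gate = new ReaderWriterLockSlim();

    public static void Save(string path, byte[] rawData)
    {
        Gate.EnterWriteLock();
        try { File.WriteAllBytes(path, rawData); }   // opens, writes, and closes the handle
        finally { Gate.ExitWriteLock(); }
    }

    public static void Delete(string path)
    {
        Gate.EnterWriteLock();
        try { if (File.Exists(path)) File.Delete(path); }
        finally { Gate.ExitWriteLock(); }
    }
}
```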