I have a web application (MainApplication) where many of the pages contain a custom Web Control that looks for some content in a cache. If it can't find any data within the cache, then it goes out to a database for the content. After retrieving the content, the Control displays the content on the page.
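Roughly, the control does something like this (a simplified sketch; ContentRepository and the cache key scheme are stand-ins for the real data-access code):

    using System;
    using System.Web;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    // Hypothetical stand-in for the real content lookup.
    public static class ContentRepository
    {
        public static string Load(string contentKey)
        {
            // Real code queries the database here.
            return "<p>content for " + contentKey + "</p>";
        }
    }

    // Minimal sketch of the cache-or-database pattern described above.
    public class ContentControl : WebControl
    {
        public string ContentKey { get; set; }

        protected override void RenderContents(HtmlTextWriter writer)
        {
            string cacheKey = "content:" + ContentKey;

            // Try the ASP.NET cache first (per-application, in-process).
            string html = HttpRuntime.Cache[cacheKey] as string;
            if (html == null)
            {
                // Fall back to the database and cache the result.
                html = ContentRepository.Load(ContentKey);
                HttpRuntime.Cache.Insert(cacheKey, html, null,
                    DateTime.UtcNow.AddMinutes(20),
                    System.Web.Caching.Cache.NoSlidingExpiration);
            }

            writer.Write(html);
        }
    }

Because HttpRuntime.Cache is scoped to a single application domain, the CMS running as a separate application has no way to reach into this cache, which is exactly the problem.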
There is a web application (CMS) in a subdirectory within the aforementioned web application. Users use this CMS to update the content pulled in by the MainApplication.
When a user updates some content using the CMS, I need the CMS to clear the relevant portion of the cache used by the MainApplication. The problem is that, as two different web applications, they can't simply interact with the same static cache object.
The ideal solution would be to somehow share an instance of a cache object between both web applications.
Failing that, what would be the best (performance-wise) way of communicating between the two web applications? Obviously, writing to and reading from a database would defeat the purpose. I was thinking about a flat file?
Update
Thank you all for your help. Your wonderful answers actually gave me the right search terms to discover that this was a duplicate question (sorry!): Cache invalidation between two web applications
We had the exact same setup in a previous project I worked on: one ASP.NET Web Application (with MCMS backing) and another ASP.NET Web Application to display the data.
Completely different servers (same domain though).
However, when a "editor" updated content in the CMS application, the UI was automatically refreshed.
How? Glad you asked.
We stored the content in SQL Server, and used Replication. :)
The "frontend" Web Application would read the data from the database (which was replicated by the CMS system).
Now, we didn't cache this data, because the database actually stored the markup (the HTML) for the control, so we dynamically re-rendered the HTML on each request.
Why is that "defeating the purpose"?
You can't get one application to "invalidate" the cache on another application.
If you're going down this path, you need to consider a distributed caching engine (e.g. Velocity).
One option that comes to mind in such a scenario is using the Velocity distributed cache mechanism. Do read about it and give it a try if possible: http://msdn.microsoft.com/en-us/magazine/dd861287.aspx
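For reference, the client API of Velocity (which later shipped as AppFabric Caching) looks roughly like this; the cache name and keys are made up, and the exact namespace and assembly depend on the release you install:

    using Microsoft.ApplicationServer.Caching; // Velocity shipped as AppFabric Caching

    class CacheClientSketch
    {
        static void Main()
        {
            // Cache host configuration typically comes from the
            // dataCacheClient section in app.config/web.config.
            var factory = new DataCacheFactory();
            DataCache cache = factory.GetCache("default"); // assumed cache name

            // Both applications talk to the same out-of-process cache cluster.
            cache.Put("content:home", "<p>cached markup</p>");

            var html = (string)cache.Get("content:home");

            // The CMS can invalidate the entry after an edit:
            cache.Remove("content:home");
        }
    }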
In ASP.NET there is the notion of Cache Dependency. You can have a look here: http://www.codeproject.com/KB/web-cache/CachingDependencies.aspx or http://www.devx.com/dotnet/Article/27865/0/page/5.
There is also the Enterprise Library Caching Block, available here, that adds some features to the standard stuff: http://msdn.microsoft.com/en-us/library/ff649093.aspx
Now, if you're running on .NET 4, there is a new System.Runtime.Caching namespace that you should definitely use: http://msdn.microsoft.com/en-us/library/system.runtime.caching.aspx
This article here "Caching in ASP.NET with the SqlCacheDependency Class" is quite interesting: http://msdn.microsoft.com/en-us/library/ms178604.aspx
I have just gone through this article on creating a web application from multiple web application projects.
https://support.microsoft.com/en-us/help/307467/how-to-create-an-asp-net-application-from-multiple-projects-for-team-d
My requirement exactly matches this. I have a large web application which I have to deliver in multiple phases, and when deploying changes to any child project, it should not affect the existing running child projects or the main project. I should be able to share user controls or DLLs between the child projects.
I need a sample of this approach. I have tried to create the same, but the sharing of user controls etc. between the child projects is not working. I think I am doing something wrong. If anyone has a sample or example of this approach, please share it.
I am working on an ASP.NET Web Forms application, not MVC.
Presuming you are using source control effectively, this should not present any problems. You can add a child project at any time and keep it checked out to yourself. When you are satisfied with unit testing you can perform integration testing before putting it live.
The only issue I see with your description of the problem lies with attempting to share User Controls between projects. People have been experiencing problems with this approach for a long time, especially with the Web Site Template. It apparently is possible for Web Forms projects:
How do I share user controls between web applications in ASP.NET?
Creating and Using User Control Libraries
Personally, I think it depends on what you are trying to do with the User Controls. For example, are they just displaying something that is repeated on different pages? In that scenario, make more judicious use of Master Pages. If they are being used for functionality, then consider creating a library of Custom Server Controls and referencing it in your projects.
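As a rough example of the Custom Server Control route (the control and namespace names are made up), you would put something like this in a class library and reference the compiled DLL from each child project:

    using System.Web.UI;
    using System.Web.UI.WebControls;

    namespace Shared.Controls
    {
        // A trivial server control that can be compiled into a DLL and
        // referenced from every child project, unlike an .ascx user control.
        public class GreetingLabel : WebControl
        {
            public string UserName { get; set; }

            protected override void RenderContents(HtmlTextWriter writer)
            {
                writer.Write("Hello, ");
                writer.WriteEncodedText(UserName ?? "guest");
            }
        }
    }

You then register the assembly either with an @ Register directive on each page or once under the pages/controls section of web.config.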
My next assignment is to build two information portals for customers. These portals will be login-protected sites containing a set of pages that display information such as orders, invoices, PDF files, and so on for the authenticated user (all presented as lists with links to detail pages). The users and the data are stored in an Oracle database. The portals differ in some of the features and in the layout.
My standard approach is to build an individual ASP.NET Web Application for every portal.
But this is not the best way to get something reusable. So for these two projects my idea is to create a set of WCF services to get the Data from the Oracle database and to build user controls to display the different elements in Umbraco. This way I hope to get a set of independent, reusable “modules” which can be used to build these portals.
Now my question: is Umbraco a good platform for this type of projects? And is my “concept” a valid approach?
Kind regards
Volkmar
Umbraco is very flexible. On the one hand, there is the question of security: with Umbraco you can use any Membership Provider you want for all visitors (also with member roles).
On the other hand, there is the question of integration: with Umbraco you can create user controls, XSLTs or Razor files as macros (which can be seen as the reusable modules).
For XSLT you can implement your own XsltExtension which pulls the external content in as an XPathNodeIterator that you can use in every XSLT macro. For .ascx files or Razor you can use Linq2Umbraco, your own objects, etc. to connect to the Oracle database.
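A rough shape of such an XSLT extension (the class, method, query and connection string are invented, and registration goes into Umbraco's config/xsltExtensions.config; System.Data.OracleClient is used purely for illustration, ODP.NET would be the usual choice):

    using System.Data.OracleClient; // illustration only; ODP.NET is the usual choice
    using System.Xml;
    using System.Xml.XPath;

    namespace Portal.XsltExtensions
    {
        // Registered in config/xsltExtensions.config so XSLT macros can call
        // something like myOracle:GetOrders($memberId). All names are invented.
        public class OracleData
        {
            public static XPathNodeIterator GetOrders(int memberId)
            {
                var doc = new XmlDocument();
                var root = doc.CreateElement("orders");
                doc.AppendChild(root);

                using (var conn = new OracleConnection("Data Source=ORCL;User Id=portal;Password=secret;"))
                using (var cmd = new OracleCommand(
                    "SELECT id, total FROM orders WHERE member_id = :memberId", conn))
                {
                    cmd.Parameters.Add(new OracleParameter("memberId", memberId));
                    conn.Open();
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            var order = doc.CreateElement("order");
                            order.SetAttribute("id", reader["id"].ToString());
                            order.SetAttribute("total", reader["total"].ToString());
                            root.AppendChild(order);
                        }
                    }
                }

                // XSLT macros consume the result as a node set.
                return doc.CreateNavigator().Select("/orders/order");
            }
        }
    }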
You can also use some sort of caching functionality to reduce the database calls. On the other hand, one of Umbraco's biggest advantages is that it stores all the content as an XML and object tree in memory, so it is very fast at content rendering. With every database call you lose a little bit of this advantage.
hth, Thomas
Ruben Verbourgh began the Oracle4Umbraco project to create an abstracted fork for the Datalayer to support running on an Oracle DB. You can find it at http://oracle4umbraco.codeplex.com/, although it has no active releases, so build from source and YMMV.
Volkmar, your concept is perfectly sound - although you might want to consider using the Umbraco data store as the persistence layer for your data rather than in the Oracle DB itself. You get XML content versioning, caching, and all the benefits of the content-management side of things, in a robust and flexible framework which you can expose to other apps later should you so need to, through the Umbraco APIs and web services.
HTH,
Benjamin
Content management of a website becomes much simpler with Umbraco.
But if you are planning to use Oracle as the backend, note that Umbraco does not support it out of the box.
So decide carefully which requirements you can compromise on.
Good luck.
I'm creating a web application using ASP.NET and WCF in a 3-tier architecture; it mostly looks like a social website. Users can register with the system and upload their profile images, documents, video clips, etc. What I want to know is: what is the best way to store those files? On the WCF side or the web application side?
Also, if I choose the web application side and store those files as a set of folders, how do I share those folders and allow access from another project (such as a desktop client that needs to upload files into that shared folder)?
Thank you all in advance.
I think the question can better be put like this:
1. Save in a folder in the web application, or close by, and have the metadata stored in a database; or
2. Grab the saved images from a database via WCF.
The second approach would likely be rather slow: grabbing the information over a service, converting it, and using an HttpHandler with the correct MIME type to spit out the binary stream to the browser...
Most architectures cut it down the middle: save the images close to, or in, the UI layer and have the metadata about them stored in the database. That metadata is mostly just a bunch of strings, so it is easily retrieved.
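For completeness, the handler in the second option would look roughly like this (a sketch; ImageServiceClient and GetImage are an assumed generated WCF proxy and operation):

    using System.Web;

    // Streams image bytes pulled over a service out to the browser.
    public class ImageHandler : IHttpHandler
    {
        public bool IsReusable
        {
            get { return true; }
        }

        public void ProcessRequest(HttpContext context)
        {
            int id = int.Parse(context.Request.QueryString["id"]);

            using (var client = new ImageServiceClient()) // assumed generated WCF proxy
            {
                byte[] bytes = client.GetImage(id);

                context.Response.ContentType = "image/jpeg"; // ideally stored with the metadata
                context.Response.BinaryWrite(bytes);
            }
        }
    }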
Update for the new question:
Since WinForms applications and other projects were not in your original question, this deviates into something new. In that case you can go for one of the following scenarios:
Use the WCF tier as a common ground and store the images behind that service. As I said, there is extra overhead in pulling the byte arrays over.
Store the images in the web UI tier and have a service (an ASMX or WCF one) expose the images to your WinForms client (a small contract sketch follows at the end of this answer).
Make a share for the WinForms client on the server where the web UI runs and where the images are. Of course, pay proper attention to security and possible attacks.
It depends on what the most used scenario is. My assumption is that the web UI layer will be used the most and the WinForms client is going to be used for image manipulation? If so, there are third-party ASP.NET controls available for such manipulation as well, so the need for a WinForms client would decrease.
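For the first two scenarios, the service side could be as small as this sketch (the contract, folder layout and names are invented; large files would normally use a streaming transfer mode):

    using System.ServiceModel;

    // A minimal WCF contract both the web UI and the WinForms client could call.
    [ServiceContract]
    public interface IImageService
    {
        [OperationContract]
        byte[] GetImage(int imageId);

        [OperationContract]
        int SaveImage(string fileName, byte[] content);
    }

    public class ImageService : IImageService
    {
        public byte[] GetImage(int imageId)
        {
            // Look up the path (here an assumed folder layout), then read the file.
            string path = System.Web.Hosting.HostingEnvironment.MapPath(
                "~/App_Data/Images/" + imageId + ".jpg");
            return System.IO.File.ReadAllBytes(path);
        }

        public int SaveImage(string fileName, byte[] content)
        {
            // Assumed layout; a real implementation must validate fileName.
            string path = System.Web.Hosting.HostingEnvironment.MapPath(
                "~/App_Data/Images/" + fileName);
            System.IO.File.WriteAllBytes(path, content);
            return 1; // would normally return the new metadata id
        }
    }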
This depends on how big you expect this thing to get.
If this is for the wider internet and you expect it to get big, having the files on the web server will make it difficult to scale out your application by adding new web servers to your web farm.
One approach would be to have the physical files uploaded to the web server, to make the uploads quick for users, and then have a coordinator background service that is triggered by an upload, perhaps using a FileSystemWatcher. This service would propagate the file to all nodes in the web farm so that subsequent requests to other nodes will find the file.
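A rough sketch of such a coordinator (node paths are made up; a real implementation would also need retry logic and a check that the upload has finished writing before copying):

    using System.IO;

    // Watches the local upload folder and copies new files to the other
    // web-farm nodes. Paths are invented; run this inside a Windows service.
    public class UploadPropagator
    {
        private static readonly string[] OtherNodes =
        {
            @"\\webserver2\uploads",
            @"\\webserver3\uploads"
        };

        private readonly FileSystemWatcher watcher;

        public UploadPropagator(string uploadFolder)
        {
            watcher = new FileSystemWatcher(uploadFolder);
            watcher.Created += OnFileCreated;
            watcher.EnableRaisingEvents = true;
        }

        private void OnFileCreated(object sender, FileSystemEventArgs e)
        {
            foreach (string node in OtherNodes)
            {
                // Overwrite in case the file already exists on the target node.
                File.Copy(e.FullPath, Path.Combine(node, e.Name), true);
            }
        }
    }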
If it is a small application intended only for within a company, on the web server is okay, with the following conditions:
You have full control over the hosting server so that you can set up the appropriate folder permissions.
You write your file saving and retrieving code in such a way that it can be moved onto the lower tiers without too much pain. Do it through an interface and inject the implementation, as sketched below.
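Something along these lines (all names invented) keeps the storage location swappable:

    using System.IO;
    using System.Web.Hosting;

    // An abstraction over file storage so the implementation can later move
    // off the web server without touching the calling code.
    public interface IFileStore
    {
        void Save(string fileName, byte[] content);
        byte[] Load(string fileName);
    }

    // Today's implementation: a folder under the web application.
    public class LocalFileStore : IFileStore
    {
        private readonly string root = HostingEnvironment.MapPath("~/App_Data/Files");

        public void Save(string fileName, byte[] content)
        {
            File.WriteAllBytes(Path.Combine(root, fileName), content);
        }

        public byte[] Load(string fileName)
        {
            return File.ReadAllBytes(Path.Combine(root, fileName));
        }
    }

Later, a WCF-backed or database-backed implementation can be injected in its place without changing the callers.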
I realise that this is going to be a fairly niche requirement and will almost certainly raise a few "WTF's" but here goes...
Within an ASP.NET Webforms application I need to serve static content from a local client machine in order to reduce up-front bandwidth requirements as much as possible (Security policy has disabled all Browser caching). The idea is to serve CSS, images and JavaScript files from a location on the local file system referenced by filesystem links from within the Web application (Yes, I know, WTF's galore but that's how it is). The application itself will effectively be an Intranet app that's hosted externally from a client but restricted by IP range along with standard username/password security. So it's pretty much a hybrid Internet/Intranet application but we can easily roll out packages of files to client machines. I am not suggesting that we expect nor require public clients to download packages of files. We have control to an extent over the client machines in terms of the local filesystem and so on but we cannot change the caching policy.
We're using UpdatePanel controls to perform partial page updates, which obviously means that we need the Microsoft AJAX JavaScript files. Presently these are served (as standard) by a resource handler within IIS/ASP.NET. Ideally I would like to take these JS files and reference them statically from the client machine, no longer serving them via an AXD.
My questions are: Is this possible? If so, how do we go about doing it?
To pre-empt some of the WTF's: the requirement stems from attempting to service a need with as little time and effort as possible whilst a more suitable solution is developed. I'm aware that we can lighten the load, we can switch to jQuery AJAX updates, we can rewrite the front-end in MVC, etc., but my question is about what we can quickly deploy with our existing application architecture.
Many thanks in advance :)
Lorna,
maybe your security team is going overboard. What is the difference between serving dynamic HTML generated by the server and dynamic JS generated by the server?
It does not make any sense. You should try talking them out of it.
What is the average size of your pages and ViewState data? You might need to store ViewState in SQL Server rather than sending it to the client browser every time.
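One way to do that is a custom PageStatePersister that keeps the serialized state server-side and sends only a small key to the browser. The sketch below uses a static dictionary purely as a stand-in for a SQL Server table; all names are invented:

    using System;
    using System.Collections.Generic;
    using System.Web.UI;

    // Keeps ViewState/ControlState on the server and sends only a GUID key
    // to the browser. The dictionary stands in for a SQL Server table.
    public class ServerSidePageStatePersister : PageStatePersister
    {
        private const string KeyField = "__VIEWSTATE_KEY";
        private static readonly Dictionary<string, Pair> Store =
            new Dictionary<string, Pair>();

        public ServerSidePageStatePersister(Page page) : base(page) { }

        public override void Save()
        {
            string key = Guid.NewGuid().ToString("N");
            Store[key] = new Pair(ViewState, ControlState);

            // Emit only the key to the client instead of the full state blob.
            Page.ClientScript.RegisterHiddenField(KeyField, key);
        }

        public override void Load()
        {
            string key = Page.Request.Form[KeyField];
            Pair state;
            if (key != null && Store.TryGetValue(key, out state))
            {
                ViewState = state.First;
                ControlState = state.Second;
            }
        }
    }

    // Wire it up in a common base page class:
    public class BasePage : Page
    {
        protected override PageStatePersister PageStatePersister
        {
            get { return new ServerSidePageStatePersister(this); }
        }
    }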
Background:
I am an intermediate web app developer working on the .Net Platform. Most of my work has been defined pretty well for me by my peers or superiors and I have no problem following instructions and getting the job done.
The task at hand:
I was recently asked by an old friend to redo his web app from scratch. His app is extremely antiquated and he is getting overwhelmed by it breaking all the time. The app in question is an inventory / CRM application and currently each customer requires a new install of the app (usually accomplished by deploying it on a different domain on the same server and pointing to a new database).
Currently, if any client wants modifications to the forms, such as additional fields, new features, etc., my friend goes in and manually adds those fields to the forms, scripts, database and so on. As a result, all installs of this application are unique. There is no single source repository and no single version of this app. Generally, new features are rolled into the other sites over time, but this is still done on a site-by-site basis.
I will be approaching this on a very modular basis. Initially I will be coding a module that will query an external web service for some data, display and store it, and periodically update it automatically. The next module will likely be for storing and displaying inventory data. This way I want to duplicate the current feature set of his app 100% over time, but do it incrementally.
The Million Dollar Questions
1. I want to make the app have user-configurable form fields. The user should be able to go to an admin page, create a new forms page of a certain category, and then specify what fields he wants in there. He could say "create a new text field called Item # and make it a requirement" and that will get stored somewhere. All forms will be dynamically rendered to screen based on what the user has configured. Is this a good way to go about the problem of having no idea what a customer could want in a form, and thus being able to store and display form data of any sort? What sort of design pattern should I follow here? (A rough sketch of the dynamic rendering follows after this list.)

2. I am familiar with ASP.NET and the .NET Framework in general and have decent knowledge of JavaScript, HTML, Silverlight, jQuery, C#, etc. I can work my way around web apps in a good way, but I am not sure what sort of framework or tech I should use to accomplish this task. Would ASP.NET 3.5 Web Forms be the way to go, or should I look into ASP.NET MVC? Do I use jQuery and AJAX for complete decoupling of frontend and backend, or will a normal ASP.NET page with some spattering of AJAX thrown in, working with a code-behind, be the order of the day?
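Something like this is what I have in mind for the first question (a very rough, entity-attribute-value style sketch; FieldDefinition, the hard-coded field list and formPlaceHolder are invented names, with formPlaceHolder being a PlaceHolder declared in the .aspx markup):

    using System;
    using System.Collections.Generic;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    // Field metadata the admin page would store (in SQL Server, say).
    public class FieldDefinition
    {
        public string Name { get; set; }       // e.g. "Item #"
        public string FieldType { get; set; }  // "text", "checkbox", ...
        public bool Required { get; set; }
    }

    public class DynamicFormPage : Page
    {
        // Matches <asp:PlaceHolder ID="formPlaceHolder" runat="server" /> in the markup.
        protected PlaceHolder formPlaceHolder;

        // In a real app these would come from the database for the chosen form category.
        private readonly List<FieldDefinition> fields = new List<FieldDefinition>
        {
            new FieldDefinition { Name = "Item #", FieldType = "text", Required = true }
        };

        protected override void OnInit(EventArgs e)
        {
            base.OnInit(e);

            // Dynamic controls must be recreated on every request, in OnInit,
            // so ViewState and postback data bind up correctly.
            int index = 0;
            foreach (FieldDefinition field in fields)
            {
                formPlaceHolder.Controls.Add(new Label { Text = field.Name });

                var textBox = new TextBox { ID = "field" + index++ };
                formPlaceHolder.Controls.Add(textBox);

                if (field.Required)
                {
                    formPlaceHolder.Controls.Add(new RequiredFieldValidator
                    {
                        ControlToValidate = textBox.ID,
                        ErrorMessage = field.Name + " is required"
                    });
                }
            }
        }
    }

On postback, the submitted values would then be stored as generic name/value rows keyed by the field definition, rather than as fixed columns.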
Just looking for general advice before I start.
I am currently thinking of using ASP.NET 3.5 Web Forms; jQuery for client-side animation, UI, manipulation, and data validation; and SQL Server plus a .NET or WCF web service for the backend.
Your advice is much appreciated as always.
I've recently implemented a white-label ecommerce system for an insurance company that allowed each partner to choose their own set of input fields and screens, and to order the flow of the application to suit their individual needs.
Although it wasn't rocket science, it added complexity and increased development time.
Consider the user-configuration aspect very carefully. In hindsight, both my client and their clients in turn would have been happy with a more rigid system.
As for the tech side of your question, I developed my project in VS2005, using ASP.NET Web Forms and web services with a SQL Server back end, so the stack that you're looking at is definitely capable of delivering a working product. ASP.NET MVC will almost certainly help as far as testability goes.
The biggest thing I would change now, if I were going to start again, would be to replace the intermediate web services with message-based services using NServiceBus, MassTransit or the like. While the web services worked fine, message-based communication should be quicker and more reliable.
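For flavor, a handler in the older NServiceBus-style API looks roughly like this (the message and handler names are invented, and the exact interface has changed between NServiceBus versions):

    using NServiceBus;

    // A message contract shared between the web front end and the back end.
    // In newer NServiceBus versions Handle() is async and takes an
    // IMessageHandlerContext parameter.
    public class SubmitOrder : ICommand
    {
        public int OrderId { get; set; }
        public decimal Total { get; set; }
    }

    // The back-end handler that replaces a synchronous web service call.
    public class SubmitOrderHandler : IHandleMessages<SubmitOrder>
    {
        public void Handle(SubmitOrder message)
        {
            // Persist the order; if this throws, the message is retried
            // rather than being lost, which is the reliability win.
        }
    }

The web front end then sends the message on the bus instead of blocking on a web service response.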
Finally, before you start to code, make sure that you understand the current system's functionality inside and out. If the new system doesn't do something that the old system did, it will be pretty obvious to the end users straight away.