I realise that this is going to be a fairly niche requirement and will almost certainly raise a few "WTFs", but here goes...
Within an ASP.NET WebForms application I need to serve static content from the local client machine in order to reduce up-front bandwidth requirements as much as possible (security policy has disabled all browser caching). The idea is to serve CSS, images and JavaScript files from a location on the local file system, referenced by filesystem links from within the web application (yes, I know, WTFs galore, but that's how it is). The application itself will effectively be an intranet app that's hosted externally to the client but restricted by IP range along with standard username/password security. So it's pretty much a hybrid Internet/intranet application, but we can easily roll out packages of files to client machines. I am not suggesting that we expect or require public clients to download packages of files. We have control, to an extent, over the client machines in terms of the local filesystem and so on, but we cannot change the caching policy.
We're using UpdatePanel controls to perform partial page updates, which obviously means that we need the Microsoft AJAX JavaScript files. Presently these are served, as standard, by a resource handler within IIS/ASP.NET. Ideally I would like to be able to take these JS files and reference them statically from a client machine, and no longer serve them via an AXD.
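To make the goal concrete, something along these lines is what I'm imagining, although I don't yet know whether a ScriptReference will even accept a local/file path (the path below is purely a placeholder for wherever we roll the package out to):

```aspx
<%-- Hypothetical markup: point the framework scripts at static copies
     instead of letting ScriptResource.axd serve them. --%>
<asp:ScriptManager ID="ScriptManager1" runat="server">
    <Scripts>
        <asp:ScriptReference Name="MicrosoftAjax.js"
                             Path="file:///C:/AppStatic/MicrosoftAjax.js" />
        <asp:ScriptReference Name="MicrosoftAjaxWebForms.js"
                             Path="file:///C:/AppStatic/MicrosoftAjaxWebForms.js" />
    </Scripts>
</asp:ScriptManager>
```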
My questions are: Is this possible? If it is possible, how do we go about doing so?
In order to attempt to pre-empt some WTFs: the requirement stems from attempting to satisfy a need with as little time and effort as possible whilst a more suitable solution is developed. I'm aware that we can lighten the load, we can switch to jQuery AJAX updates, we can rewrite the front-end in MVC, etc., but my question is related to what we can quickly deploy with our existing application architecture.
Many thanks in advance :)
Lorna,
Maybe your security team is going crazy. What is the difference between serving dynamic HTML generated by the server and dynamic JS generated by the server?
It does not make any sense. You should try talking them out of it.
What is the average size of your pages and ViewState data? You might need to store ViewState in SQL Server rather than sending it to the client browser every time.
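To illustrate one low-effort variant of this, .NET ships a SessionPageStatePersister that keeps the bulk of ViewState on the server instead of in the page; a fully custom persister could write to SQL Server instead. A minimal sketch, assuming a common base page:

```csharp
using System.Web.UI;

// Pages inheriting from this base class keep the bulk of their ViewState
// on the server in Session state rather than round-tripping it in a
// hidden field. Configuring SQL Server session state then moves that
// data into SQL Server as well.
public class ServerSideStatePage : Page
{
    protected override PageStatePersister PageStatePersister
    {
        get { return new SessionPageStatePersister(this); }
    }
}
```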
We have a non-standard Kentico architecture which Kentico have advised is supported as long as synchronization of physical files between load-balanced servers is disabled and handled manually. What is the correct way to manually synchronize web farm server files? I wondered about using a tool like DirSync, but assume this would require one server to act as the primary, whereas with Kentico a new media file, for example, may initially be saved to any of the physical servers.
I'm hoping to identify a definitive solution to this issue. Thanks.
The Kentico web farm by default synchronizes physical files automatically, provided the web farm is working properly. As each request can be served by a different server, Kentico serializes the file binary into the database, which is shared by all servers, and then re-creates the file on any server where it is missing.
I'm not aware of any situation where web farms are supported but file synchronization isn't. It's either all or nothing; there is no middle solution.
Can you be more specific about why the synchronization of physical files is not working on your end? As long as all servers can see the database (which they should, otherwise the web farm is not working at all) the file synchronization will work.
PS: If your files are not synchronized, go to the Web farm -> Tasks application and check how many tasks are there. If there are no tasks (or very few, which are being deleted constantly) then your web farm is working; if there are tasks older than a few minutes then your web farm is not working at all.
I read the thread above and would recommend you take a look at this tool from BizStream: https://devnet.kentico.com/marketplace/modules/compare-for-kentico
I haven't gotten to play with it myself, but they are a top-notch shop, so I can bet it's a top-notch product.
Otherwise you are going to have to go down the custom sync code route.
We've tried to do moves via the SQL tables and it is 'possible', but the number of interconnected relationships just makes it quite unrealistic to build or support.
I came across a case study a few days ago. It relates to web application architecture.
Here is the scenario:
There is a single web service used by, say, 1000 web applications. This web service is hosted on a particular server. If the web service's hosting location is changed, how do the other applications come to know about the change?
Keeping it in web.config doesn't seem to be a feasible solution, as we would need to modify the web.config files of all the applications.
Keeping these settings in a common repository and letting all the applications read the web service address from it came to mind, but then there is the question of where to store this common repository.
I am just curious to know how this could be achieved with the best performance.
Thanks in advance for any kind of suggestions.
Do you have full access or control over all those web applications consuming that web service? If so, you could have a script or some custom code which updates all their web.config(s) at once; see the sketch below. It seems like too much work, but in fact this way you have more control, and you could also, if needed, point only some applications to the new URL and leave others on another URL.
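A rough sketch of such a script, assuming the URL lives in an appSettings key (the key name and folder layout are hypothetical):

```csharp
using System;
using System.IO;
using System.Xml;

class WebConfigUpdater
{
    static void Main()
    {
        // Hypothetical list of application root folders; in practice this
        // could come from IIS metadata or a deployment manifest.
        string[] appRoots = Directory.GetDirectories(@"C:\inetpub\wwwroot");
        const string newUrl = "http://services.example.com/MyService.svc";

        foreach (string root in appRoots)
        {
            string configPath = Path.Combine(root, "web.config");
            if (!File.Exists(configPath)) continue;

            var doc = new XmlDocument();
            doc.Load(configPath);

            // Assumes the URL is stored as <add key="ServiceUrl" value="..."/>.
            var node = doc.SelectSingleNode(
                "/configuration/appSettings/add[@key='ServiceUrl']");
            if (node != null && node.Attributes != null)
            {
                node.Attributes["value"].Value = newUrl;
                doc.Save(configPath); // saving also recycles the app domain
            }
        }
    }
}
```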
The idea of keeping the setting in a centralized database gives you faster update propagation, which could also be bad in case of errors, and then you have all applications referring to the same place with no way to split them. You would in any case have to connect to that centralized database from all of them, which probably means adding a key to their web.config(s) with the connection string to that database; then, if the database is unreachable or down, the web applications will not be able to consume the web service simply because they cannot get its URL.
I would go for the web.config. You could also have a settings helper class that abstracts the retrieval of that URL so the UI or front end does not know where it comes from.
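Something along these lines, purely as a sketch (the key name is an assumption):

```csharp
using System.Configuration;

// Callers never know whether the URL comes from web.config, a database,
// or anywhere else, so the backing store can change later without
// touching the UI code.
public static class ServiceSettings
{
    public static string WebServiceUrl
    {
        get { return ConfigurationManager.AppSettings["WebServiceUrl"]; }
    }
}
```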
Anyway, do you plan to change the URL of the web service often? Wouldn't it be better to copy it to a new URL but also keep it available on the current URL for a while?
Another advantage of the web.config approach is that every time you update and save it the application is restarted, while a change in a database might take a while to be detected if you have some caching mechanism in place.
Hope this helps.
Davide.
I have a web application (MainApplication) where many of the pages contain a custom web control that looks for some content in a cache. If it can't find any data within the cache, it goes out to a database for the content. After retrieving the content, the control displays it on the page.
There is a web application (CMS) in a subdirectory within the aforementioned web application. Users use this CMS to update the content pulled in by the MainApplication.
When a user updates some content using the CMS, I need the CMS to clear the relevant portion of the cache used by the MainApplication. The problem is that, as two different web applications, they can't simply interact with the same static cache object.
The ideal solution would be to somehow share an instance of a cache object between both web applications.
Failing that, what would be the best (performance-wise) way of communicating between the two web applications? Obviously, writing/reading to a database would defeat the purpose. I was thinking about a flat file?
Update
Thank you all for your help. Your wonderful answers actually gave me the right search terms to discover that this was a duplicate question (sorry!): Cache invalidation between two web applications
We had the exact same setup in a previous project I worked on, where we had one ASP.NET web application (with MCMS backing) and another ASP.NET web application to display the data.
Completely different servers (same domain though).
However, when a "editor" updated content in the CMS application, the UI was automatically refreshed.
How? Glad you asked.
We stored the content in SQL Server, and used Replication. :)
The "frontend" Web Application would read the data from the database (which was replicated by the CMS system).
Now, we didn't cache this data, because in the database we actually stored the markup (the HTML) for the control. We therefore dynamically re-rendered the HTML.
Why is that "defeating the purpose"?
You can't get one application to "invalidate" the cache on another application.
If you're going down this path, you need to consider a distributed caching engine (e.g. Velocity).
One option that comes to mind in such a scenario is using the Velocity distributed cache mechanism. Do read about it and give it a try if possible: http://msdn.microsoft.com/en-us/magazine/dd861287.aspx
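For illustration, a minimal sketch of what shared cache access could look like with the Velocity/AppFabric client API (the cache name and keys are assumptions; both applications would point at the same cache cluster):

```csharp
using Microsoft.ApplicationServer.Caching;

class SharedCacheExample
{
    static void Main()
    {
        // Both web applications talk to the same cache cluster, so a
        // removal performed by the CMS is immediately visible to the
        // main application.
        var factory = new DataCacheFactory();          // reads client config
        DataCache cache = factory.GetCache("default"); // assumed cache name

        cache.Put("article-42", "<p>cached markup</p>");

        // CMS side, after an edit:
        cache.Remove("article-42");
    }
}
```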
In ASP.NET there is the notion of Cache Dependency. You can have a look here: http://www.codeproject.com/KB/web-cache/CachingDependencies.aspx or http://www.devx.com/dotnet/Article/27865/0/page/5.
There is also the Enterprise Library Caching Block, available here, that adds some features to the standard stuff: http://msdn.microsoft.com/en-us/library/ff649093.aspx
Now, if you're running on .NET 4, there is a new System.Runtime.Caching namespace that you should definitely use: http://msdn.microsoft.com/en-us/library/system.runtime.caching.aspx
This article here "Caching in ASP.NET with the SqlCacheDependency Class" is quite interesting: http://msdn.microsoft.com/en-us/library/ms178604.aspx
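As a rough illustration of how SqlCacheDependency could bridge the two applications (the database entry, table, and key names here are invented; both apps would configure the same dependency in web.config):

```csharp
using System.Web;
using System.Web.Caching;

public static class ContentCache
{
    public static void CacheContent(string key, string html)
    {
        // Assumes <sqlCacheDependency> is configured in web.config for a
        // database entry named "CmsDb" and that notifications are enabled
        // on the "Content" table (aspnet_regsql.exe -et -t Content ...).
        var dependency = new SqlCacheDependency("CmsDb", "Content");

        // Any UPDATE to the Content table (e.g. from the CMS application)
        // invalidates this entry in the MainApplication's cache.
        HttpRuntime.Cache.Insert(key, html, dependency);
    }
}
```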
I'm creating a web application using ASP.NET and WCF in a 3-tier architecture, which mostly looks like a social website. Users can register with the system and upload their profile images, documents, video clips, etc. So, what I want to know is: what is the best way to store those files? On the WCF side or the web application side?
Also, if I choose the web application side and store those files as a set of folders, how do I make those folders shared and allow access from a different project (such as a desktop client that needs to upload files into that shared folder)?
Thank you all in advance.
I think the question can better be put like this: either
save the images in a folder in, or close to, the web application and store the metadata in a database, or
grab the saved images from a database via WCF.
The second approach would likely be rather slow: grabbing the information over a service, converting it, using an HttpHandler with the correct MIME type to spit out the binary stream to the browser...
Most architectures cut it down the middle: save the images close to, or in, the UI layer and have the metadata about them stored in the database. That metadata is mostly just a bunch of strings, so it is easily retrieved.
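A bare-bones sketch of that split, assuming a WebForms upload handler (the folder, table, and column names are invented for illustration):

```csharp
using System;
using System.Data.SqlClient;
using System.IO;
using System.Web;

public static class ImageStore
{
    public static void Save(HttpPostedFile upload, string connectionString)
    {
        // The binary goes to a folder inside the web application...
        string fileName = Guid.NewGuid() + Path.GetExtension(upload.FileName);
        string physicalPath = Path.Combine(
            HttpContext.Current.Server.MapPath("~/Uploads"), fileName);
        upload.SaveAs(physicalPath);

        // ...while only the metadata goes to the database.
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO UserFiles (FileName, ContentType, SizeBytes) " +
            "VALUES (@name, @type, @size)", conn))
        {
            cmd.Parameters.AddWithValue("@name", fileName);
            cmd.Parameters.AddWithValue("@type", upload.ContentType);
            cmd.Parameters.AddWithValue("@size", upload.ContentLength);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```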
Update for the new question:
Since WinForms applications/other projects were not in your original question, this deviates into something new. In that case you can go for one of the following scenarios:
Use the WCF tier as common ground and store the images behind that service. As I said, it's going to cost extra to pull the byte arrays over.
Store the images in the web UI tier and have a service (ASMX or WCF) expose the images to your WinForms client.
Make a share for the WinForms client on the server where the web UI runs and where the images are. Of course, be respectful of security and possible attacks.
It depends on what the most-used scenario is. My assumption is that the web UI layer will be used mostly, and the WinForms client is going to be used for image manipulation? If so, there are third-party ASP.NET controls available for such manipulation as well, so the need for a WinForms client would decrease.
This depends on how big you expect this thing to get.
If this is for the wider Internet and you expect it to get big, having the files on the web server will make it difficult to scale up your application by adding new web servers to your web farm.
One approach would be to have the physical files uploaded to the web server, to make uploads quick for users, and then have a coordinator background service, triggered by an upload (perhaps using a FileSystemWatcher), that propagates the file to all nodes in the web farm so that subsequent requests to other nodes will find the file.
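A rough sketch of such a coordinator, assuming uploads land in a local folder and the other nodes expose UNC shares (all paths here are hypothetical):

```csharp
using System;
using System.IO;

class UploadPropagator
{
    static readonly string[] PeerShares =
    {
        @"\\web02\uploads", // hypothetical farm nodes
        @"\\web03\uploads"
    };

    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\Sites\MainApp\Uploads");
        watcher.Created += (sender, e) =>
        {
            // Copy each newly uploaded file to every other node so that
            // subsequent requests served elsewhere can find it. A real
            // version would wait for the upload to finish writing first.
            foreach (string share in PeerShares)
            {
                File.Copy(e.FullPath,
                    Path.Combine(share, e.Name), overwrite: true);
            }
        };
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching for uploads; press Enter to stop.");
        Console.ReadLine();
    }
}
```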
If it is a small application intended only for within a company, on the web server is okay, with the following conditions:
You have full control over the hosting server so that you can set up the appropriate folder permissions.
You write your file saving and retrieving code in such a way that it can be moved onto the lower tiers without too much pain. Do it through an interface and inject the implementation.
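For instance, a minimal sketch of that abstraction (the names are illustrative, not a prescribed design):

```csharp
using System.IO;

// The web tier only ever sees this interface; swapping local-disk storage
// for a WCF-backed or SAN-backed store later means writing one new class.
public interface IFileStore
{
    void Save(string name, Stream content);
    Stream Open(string name);
}

public class LocalDiskFileStore : IFileStore
{
    private readonly string _root;
    public LocalDiskFileStore(string root) { _root = root; }

    public void Save(string name, Stream content)
    {
        using (var target = File.Create(Path.Combine(_root, name)))
        {
            content.CopyTo(target); // .NET 4+; copy in a loop on 3.5
        }
    }

    public Stream Open(string name)
    {
        return File.OpenRead(Path.Combine(_root, name));
    }
}
```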
We have a web application that uses AJAX to talk to an ASP.NET web service. We would like to write another version that can be used offline. We need to be able to re-use our existing code as much as possible. What approaches should we consider?
The app is currently using XmlHttpRequest to get dynamic data from the server. Obviously the offline version will not be able to talk to the server, but it does need to talk to something! I'm sure installing IIS or Cassini on the client would work, but I was hoping for a simpler solution. Is there no other way for JavaScript to talk to some external code?
There are plenty of offline web apps nowadays. They simply evolved from AJAX.
For example: WoaS (wiki on a stick / stickwiki) and TiddlyWiki, and Google Docs and Gmail are going to be available offline.
You don't need a web server to run these web apps in offline mode. Just store the required data and scripts on the client side (usually as XML).
One of the possibilities would be to use Cassini. This is a web server that acts as a host for the ASP.NET runtime. You can host Cassini in a Windows application or a Windows service. In this scenario you do not have to rewrite the web app or the web service.
Most other solutions do require a rewrite of both your web app and your web service. Depending on the way you have written the existing app you can reuse more or less code.
Have you considered HTML5 with application cache and offline storage?
If you hope to create an "offline" version of your package your biggest issue by far will be the need to install your site into a local copy of IIS (registering a virtual directory, etc.). I pursued this briefly a few years ago and gave up in frustration. It can be done: a number of software vendors such as DevExpress do this so you have local copies of their demonstration projects. Indeed, I was able to do this. The problem was the classic "it works on my computer" syndrome. There was simply no way to guarantee that most of my end-users had anywhere near the technical proficiency to make this work.
Thus, I would strongly recommend that you not pursue this path unless you have very technically proficient users and a huge support staff.
But there is one more very important question: did you abstract all data access code to a DAL? If not, then you have a lot of work to do in managing data access as well.
Update: user "Rine" has recommended Cassini. I just wanted to let you know that I pursued Cassini and another 3rd-party web server as well. I think that there are licensing issues with Cassini but may be wrong - it has been awhile. However, I do distinctly remember running into barrier after barrier with this approach and very little documentation to help me out.
If you want a web application to run offline, you need a web server (IIS for ASP.NET) bound to the localhost (127.0.0.1) address. After this you can access your web application by typing http://127.0.0.1/ into your web browser, the same way as you do online.
If your AJAX relies on XMLHttpRequests, you can:
Make static versions of the XMLs you get over XMLHttpRequest and put them into a folder on disk.
Rewrite your XMLHttpRequest URLs so that they point to the files on disk.
Rewrite your XMLHttpRequests so that they don't check status (it's always 0 for the file:// protocol).
All JScript works on file:// pages just as it does on http:// ones.
Of course it's not the best way to develop static pages, but it may save you some time on rewriting.
I haven't come across any framework specifically built for ASP.NET like the ones available for PHP or RoR.
Here is a good article by Steven to get you started with HTML5 and ASP.NET: Creating HTML 5 Offline application
Obviously the offline version will not be able to talk to the server, but it does need to talk to something!
Enter HTML5 localStorage. It works like a database and enables you to put data on your client. Indeed, you have to rework parts of your code in JavaScript and transmit it to the client, but then it would work offline.
Local Storage works like this:
- Setter: window.localStorage.setItem(KEY, VALUE)
- Getter: window.localStorage.getItem(KEY)
- Remove: window.localStorage.removeItem(KEY)
To get the main page working offline you need to create a manifest. This is used to store complete sites on the client. Please refer to this for more information about manifests:
http://diveintohtml5.info/offline.html
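By way of illustration, a minimal manifest might look like this (the file names are placeholders; the page's html element would reference it via the manifest attribute):

```
CACHE MANIFEST
# v1 - bump this comment to force clients to re-download

CACHE:
default.aspx
styles/site.css
scripts/app.js

NETWORK:
*
```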
You want to build a web application to work offline? It can't be done.
You could split the interface code from the rest (into different DLLs) and create a Windows application to mimic the behaviour of your web application. This way you have two distinct user interfaces but the same code for business rules and data access.
I don't really see any other way...