I'm moving an old VBScript web site over to ASP.NET, so I'm starting to use master pages instead of #including lots of other files with server-side VBScript in them.
How can I stop the pages (the ones based on master pages) from being generated once and then stored? If I make a change to the master page (or any page based on it), those changes are not visible, because the web server is still serving the previous versions.
It sounds like you have caching enabled; the normal behaviour without caching is to regenerate the page on every request.
That shouldn't be the case. Are you sure it's not a caching issue?
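If it does turn out to be output caching, it is normally declared explicitly, either as a directive at the top of the page (or master page) or through a cache profile in web.config. A rough sketch of what to look for (the durations and profile name here are just placeholders):

    <%@ OutputCache Duration="3600" VaryByParam="None" %>

    <!-- or, centrally, in web.config -->
    <system.web>
      <caching>
        <outputCacheSettings>
          <outputCacheProfiles>
            <add name="LongCache" duration="3600" varyByParam="None" />
          </outputCacheProfiles>
        </outputCacheSettings>
      </caching>
    </system.web>

Removing the directive, or setting enableOutputCache="false" on the <outputCache> element, should make the pages regenerate on every request.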
I installed VS 2012 on my work PC, and for the life of me, when I go to add a new web form I cannot see or find the two options for 'Select Master Page' or 'Place code in separate file'.
Any ideas on how I can find them?
I've just been fooled by exactly that for the past hour. What I've found is that the 'old' method of using the menu option 'Add web page using master' (or similar) has vanished and been replaced by 'Content Page'. When you select this, you're then given the option of choosing a master page.
Incidentally, I don't get the option of starting a web site project, which may be why I'm not seeing this either.
You should have a 'Web Form with Master Page' item in the Add New Item dialog.
Placing code in a separate file is a different concept called code-behind. You will have an ASPX file and a .cs / .vb file which handles the events.
e.g.
Default.aspx
Default.aspx.cs
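A minimal sketch of how the pair fits together (the namespace, class and control names are just illustrative; in a Web Site project the attribute is CodeFile rather than CodeBehind):

    <%-- Default.aspx --%>
    <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Default.aspx.cs"
        Inherits="MyApp._Default" MasterPageFile="~/Site.Master" %>
    <asp:Content ID="MainContent" ContentPlaceHolderID="MainContent" runat="server">
        <asp:Button ID="SaveButton" runat="server" Text="Save" OnClick="SaveButton_Click" />
    </asp:Content>

    // Default.aspx.cs
    namespace MyApp
    {
        public partial class _Default : System.Web.UI.Page
        {
            protected void SaveButton_Click(object sender, System.EventArgs e)
            {
                // event handling lives here, not in the markup
            }
        }
    }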
EDIT: Based on comment.
You may be using a Web Site on your laptop, but a Web Application at work. See Web Application Projects vs. Web Site Projects.
Take a look at ASP.NET Web Site or ASP.NET Web Application? - this might also help in deciding which to use. Generally, I always go for Web Application.
You can use VS to convert from a Web Site to a Web Application, but AFAIK, you can't do it the other way round unless you create a new project and copy across the relevant parts - which could be a big job depending on the size of the site.
If I'm taking over an active ASP site but will be replacing all existing .asp pages with .xhtml pages, are there any issues that may arise? Forms have been removed from the site at this time, so it's virtually static, but I will be making modifications in the future (video, forms, etc.).
At this time, all production and testing have been on a demo server, and I will be taking the site live this weekend.
There shouldn't be any problems as long as you did your job right.
How hard that job was depends a lot on the complexity of the old ASP site.
You removed all forms, fine. But what about links with database look-ups like
book.asp?isbn=6546465445? How do you handle those using static pages? You would have to create a page for each possible book and remap the links.
Did you change all links from *.asp to *.xhtml, or just remap *.asp to a static handler on the server?
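If you go the remapping route, one option (assuming the new site runs on IIS 7 or later with the URL Rewrite module installed) is a permanent redirect from the old .asp URLs to their .xhtml counterparts, so old bookmarks and search-engine links keep working. A rough sketch:

    <!-- web.config: redirect /anything.asp to /anything.xhtml -->
    <system.webServer>
      <rewrite>
        <rules>
          <rule name="asp to xhtml" stopProcessing="true">
            <match url="^(.*)\.asp$" />
            <action type="Redirect" url="{R:1}.xhtml" redirectType="Permanent" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>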
In the past I had to take a very dynamic ASP site and make it work from a CD-Rom without a server.
A good link-checking tool to test the new site is very helpful.
My setup: ASP.NET 4.0 Web Site running on IIS 6.0.
I have a site with many landing pages with virtual URLs, all being served by a single physical landingpage.aspx file (via ASP.NET 4.0 Routing), in which I use the OutputCache directive, caching the page on the server for 24 hours.
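The directive on landingpage.aspx is roughly along these lines (the exact attribute values here are illustrative; the duration corresponds to the 24-hour server cache):

    <%@ OutputCache Duration="86400" VaryByParam="None" Location="Server" %>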
Recently, I started to experience a problem that the __doPostBack JavaScript block was missing from some of the generated pages. There is a LinkButton in the page (inside a webusercontrol), so the JS block should be there, but sometimes it isn't. This is naturally leading to JS errors in the browser upon clicking the LinkButton.
My suspicion is that maybe the first visit to a given URL handled by the above-mentioned physical .aspx file was a visit by a client (a browser or a search bot) that ASP.NET considered a down-level browser, so __doPostBack was not output into the generated cached version of the page, and this wrong cached version is then served to all subsequent visitors...? On the other hand, I'd say ASP.NET is smart enough to generate different cached versions for different levels of browsers, isn't it?
Nevertheless, my question is: can I find the cached files that are served to the visitors somewhere on the server, and somehow check if my assumptions are correct? Also, I would like to disable this ASP.NET recognition of browsers altogether and serve the same content to every browser, just as it would serve to any modern browser.
Thanks a lot for any advice and help.
Answering my own question: I confirmed that the website was sending back HTML without __doPostBack() for unrecognized browsers. Adding ClientTarget="uplevel" to the @ Page directive at the top of the .aspx page in question solved the problem, and the __doPostBack() block is now always present.
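Concretely, the change was just one extra attribute on the existing directive (the other attributes shown here are placeholders):

    <%@ Page Language="C#" ClientTarget="uplevel" CodeBehind="landingpage.aspx.cs" Inherits="MyApp.LandingPage" %>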
If we use an aspx page with a caching profile, the server caches images that are loaded through the aspx page. So if ten clients load the same image through the aspx page (same URL), the image is fetched from the database for the first client and served from the cache for the other nine.
When we use an HttpHandler, this doesn't happen. The image is always fetched from the database. We have tried all the different settings without any success (we have checked this link and have not been able to get server-side caching to work).
I can't answer based on experience of using the caching profile, so I'm not sure if this helps.
Under the covers, ASP.NET WebForms pages are served by HTTP handlers written by MS (as you'd expect). When you write your own HTTP handler you don't automatically get all the functionality that the System.Web.UI.PageHandlerFactory handler has (the one that by default looks after .aspx pages/requests) - you have to bring it in (or develop it) yourself.
Maybe this is the problem you have - maybe the caching profile capabilities are being leveraged by the aspx pages because System.Web.UI.PageHandlerFactory is already "integrated" with them out of the box, whereas when you write your own handler they just aren't there (by default) - and hence they don't work.
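If that is the case, one workaround is to set the cache policy yourself inside the handler. A rough sketch (the handler name, the "id" query-string parameter and the LoadImageFromDatabase helper are hypothetical):

    using System;
    using System.Web;

    public class ImageHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            string id = context.Request.QueryString["id"];
            byte[] imageBytes = LoadImageFromDatabase(id); // hypothetical data-access call

            context.Response.ContentType = "image/jpeg";

            // Ask ASP.NET to cache this response on the server for an hour,
            // keeping a separate cached copy per "id" value.
            context.Response.Cache.SetCacheability(HttpCacheability.Server);
            context.Response.Cache.SetExpires(DateTime.UtcNow.AddHours(1));
            context.Response.Cache.SetValidUntilExpires(true);
            context.Response.Cache.VaryByParams["id"] = true;

            context.Response.BinaryWrite(imageBytes);
        }

        public bool IsReusable
        {
            get { return true; }
        }

        private static byte[] LoadImageFromDatabase(string id)
        {
            // placeholder for whatever data access you already have
            throw new NotImplementedException();
        }
    }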
I'm in the process of creating a SharePoint web part that needs to use a single-threaded COM component. I've discovered that in order to make this work well, I need to add the AspCompat="true" directive to the page where the web part will live. The problem is I can't seem to set up such a page.
I created a new blank web part page via the regular browser interface, then added AspCompat to it using SharePoint Designer. But that causes it to become un-ghosted and the SafeMode parser says that directive isn't allowed. I then modified the blank web part page template to include the directive and created a new page, but got the same error.
I basically need to set up a page within my SharePoint site that is stored in the file system, has that directive, and can contain my web part. How would I go about creating such a page?
I recently answered this question with some instructions that include how to supply new site content page templates that you can use to host web parts. I think it's likely to work for you.
In your case, your aspx page template would have that AspCompat directive built-in.
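For illustration, the @ Page directive in such a template would look roughly like this (the Inherits line is the usual SharePoint web part page base class; the assembly version shown is for SharePoint 2010, so adjust it to your version):

    <%@ Page Language="C#" AspCompat="true" MasterPageFile="~masterurl/default.master"
        Inherits="Microsoft.SharePoint.WebPartPages.WebPartPage, Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>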
According to Microsoft SharePoint developer support, it is simply not possible to use AspCompat within a SharePoint site. The SPPageParserFilter class, which is the one that disallows certain page directives, is apparently involved in processing all pages, even those that are on the file system. See this blog post for more details.