I have noticed that while debugging my site, any request that results in a 404 and appears to refer to a path on the disk relative to my configured virtual directory is intercepted by Cassini and rudely replaced with a directory listing. I'm using the Nancy framework, but since this problem appears to be at the web server level, I suspect Cassini would act the same way for MVC applications. I can't find any documentation on this "feature" other than a related commit message in the Cassini source that says "...directory listing only overrides 404 responses for directories".
I would much rather my development web server stop trying to outsmart my framework. It makes the debugging experience more than a little jarring. In an MVC framework the request URLs have nothing at all to do with file locations, so getting directory listings for some invalid requests and the correct 404 page for others gets annoying. Not to mention it's making several of my unit tests fail, because they rely on auto-generated content in my 404 error pages (which I can't manually test either).
Is there any way to disable this functionality in Cassini? I know I could install IIS Express, but I'd rather not, especially since my unit test runner and hosts file (this is a multi-domain application) are already configured just right.
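For context, the auto-generated 404 pages come from a Nancy status code handler roughly like the following (a simplified sketch rather than my actual code; the handler name and page content are placeholders). It's the output of this handler that Cassini replaces with a directory listing whenever the request path happens to match a folder on disk.

    using Nancy;
    using Nancy.ErrorHandling;
    using Nancy.Responses;

    // Simplified 404 handler; Nancy discovers IStatusCodeHandler implementations
    // automatically. The real handler generates richer, auto-generated content.
    public class NotFoundHandler : IStatusCodeHandler
    {
        public bool HandlesStatusCode(HttpStatusCode statusCode, NancyContext context)
        {
            return statusCode == HttpStatusCode.NotFound;
        }

        public void Handle(HttpStatusCode statusCode, NancyContext context)
        {
            context.Response = new TextResponse("<h1>Not found</h1>", "text/html")
            {
                StatusCode = HttpStatusCode.NotFound
            };
        }
    }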
Stop using Cassini and start using IIS Express instead.
Cassini has many shortcomings (SSL support being one, the problem you are seeing another, and many more).
IIS Express is based on IIS code and is as close to IIS as can be while still being lightweight.
I am developing an ASP.NET application and deploying it to an IIS 7 server via WebDeploy. This is a single server (no web farm or anything like that). I've been using the same setup for two years with no problems. Since last night, the server seems to be "stuck" on the last version of the site that I deployed before dinner. I deployed a couple of new versions today, but the server keeps serving the old pages.
I have triple-checked this. When I log into the server via RDP and open a specific ASPX file, I can see that it's the new version I've just deployed, so the server is actually storing the new versions. However, when I visit the web site over HTTP from my computer, I get the old version of the file.
I have restarted the server (the whole machine, not just IIS). I have disabled the IIS cache. I have disabled the compression cache. I have tried from multiple client computers, including one from which I had never visited this site before (so no client cache could exist). But nothing worked.
I am aware that similar issues have been reported, and I have read some posts about it. But I seem to have exhausted all possible checks. Any ideas on how to proceed? Thanks.
After much struggling, I managed to solve this issue. I deleted the whole web site from the server, and I deployed it from a computer other than my usual development machine. This fixed the issue.
However, I am still baffled at why this happened. It must have been a glitch with WebDeploy and/or IIS.
I have a problem deploying a .NET Core application, hosted on IIS, via FTP.
The main DLLs (the core application) that I want to update just won't upload; FTP gives me a generic permission error message. I think the reason is that they are in use, because when I stop the application pool, upload, and restart it, everything works just fine.
But this isn't really a solution. Are there any other methods of publishing that will alleviate this problem?
Edit:
"open for write: failure"
Is the only error I'm getting. I can't find anything online and the only solution I have is restart the app pool.
I found an answer and I figured it should be here for future Googling.
The issue is, as I first expected, that IIS proxies the request to Kestrel, which means the process is in use as far as Windows is concerned. There are three solutions.
The Good Solution
Have two (or more) VMs on Azure behind a load balancer. Have a script which turns off the sites one at a time, does what it needs to do, and turns them back on. Do this right and there's no downtime!
Intermission
Before I talk about the other solutions, a little explanation. I have not been working with .NET for very long, but apparently there is this thing you can do where you add an app_offline.htm and it temporarily takes the site down for you.
In the context of IIS and .NET Core it also releases the process, which is really useful as it solves my problem! Although I had to visit the web page first for it to take effect, unless I'm mistaken.
The Bad Solution
Use an automated script to rename _app_offline.htm to app_offline.htm. Do the upgrade and then revert that change. Takes your site down, kind of ugly but scripting is always better than...
The Ugly Solution
You only have access to FTP, no remote admin or proper deployment process because... reasons.
Upload an app_offline.htm, upload as little as possible and hope it doesn't break anything before deleting or renaming app_offline.htm.
Also you would have to perform any DB migrations by using EnableAutomaticMigrations = true because you have no server access or scripting methods.
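For anyone wanting to script "The Bad Solution" above, here is a rough sketch of the rename step over plain FTP (the host, path and credentials are placeholders, and this is an illustration rather than a finished deployment tool):

    using System;
    using System.Net;

    // Renaming _app_offline.htm to app_offline.htm makes IIS take the site
    // offline and release the Kestrel process so the DLLs can be overwritten;
    // renaming it back afterwards brings the site up again.
    class AppOfflineToggle
    {
        static void Rename(string fromName, string toName)
        {
            var request = (FtpWebRequest)WebRequest.Create(
                "ftp://ftp.example.com/site/wwwroot/" + fromName);
            request.Method = WebRequestMethods.Ftp.Rename;
            request.RenameTo = toName;
            request.Credentials = new NetworkCredential("deployUser", "secret");

            using (var response = (FtpWebResponse)request.GetResponse())
            {
                Console.WriteLine("{0} -> {1}: {2}", fromName, toName,
                    response.StatusDescription);
            }
        }

        static void Main()
        {
            Rename("_app_offline.htm", "app_offline.htm");   // take the site offline
            // ...upload the new DLLs here...
            Rename("app_offline.htm", "_app_offline.htm");   // bring it back up
        }
    }

The same idea works with any scriptable FTP client; the only essential part is that app_offline.htm exists while the binaries are being replaced.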
HTTP 404. The resource you are looking for (or one of its dependencies) could have been removed, had its name changed, or is temporarily unavailable. Please review the following URL and make sure that it is spelled correctly.
Requested URL: /webctrl_client/1_0/treeimages/Rminus.gif
I had this problem for a simple reason: make sure you compile your site before deploying it. I had some pages that were compiled and some pages that were not. The confusion was compounded because I was testing on a Windows Server 2008 box, not a Windows 7 box.
It took me so long to figure out because when I tried viewing these pages on the application server (Windows Server 2008), Internet Explorer's security settings wouldn't let me view a non-secure (HTTP) page, and I only had a binding for port 80 when testing in a browser on the server box. So I couldn't even see the ASPX page that was compiled, let alone the ASPX pages that were not. A plain HTML page on the same site was visible on both boxes, however, which was interesting because it told me IIS was running fine. When I viewed the pages from another machine on that network (running Windows 7), the compiled ASPX page showed up fine, and the non-compiled pages did not.
Here are other possible issues:
(Use the correct .NET Framework version in the commands below.)
1.) Have you installed the .NET framework?
2.) Make sure ASP.NET 4.0 is registered. Run these commands to verify.
"%windir%\Microsoft.NET\Framework\v4.0.30319\aspnet_regiis.exe" -lv
"%systemroot%\system32\inetsrv\appcmd.exe" list apppool /managedRuntimeVersion:v4.0
"%systemroot%\system32\inetsrv\appcmd.exe" list config -section:system.webServer/isapiFilters
If not, run this:
"%windir%\Microsoft.NET\Framework\v4.0.30319\aspnet_regiis" -i -enable
3.) Make sure the ISAPI filters are turned on for the version you are using. Click the server (not the site) in IIS 7, then go to "ISAPI Filters" and allow the ones you need.
4.) Make sure your application pool is running the same .NET version that your ASP.NET pages were compiled against. Go to Application Pools in IIS, right-click the application pool for your site, and choose Advanced Settings. Change the version to either 2.0 or 4.0. Make sure it's also 32-bit if you compiled your app as 32-bit.
... when reading forums, those are the four solutions I came across most frequently. My own issue was simpler, but it confused me for hours: I hadn't installed the SSL certificate yet.
I'm developing a web site in a high-security environment. For example, we use CAC cards to authenticate users over SSL.
The site is a mix of VB.NET and C# on .NET 3.5 with some AJAX. The AJAX parts are now calling web services for things like Cascading Drop Down Lists.
We've been running VS2008 configured on our local PCs to use IIS instead of the default server (Cassini). However, some security policies were rolled out to the desktops over the weekend and, suddenly, we're not allowed to run IIS on our PCs anymore.
I already have some of our IT people trying to appeal for waivers for developers. In the meantime, I need to find a way to keep developing.
If I turn off the SSL requirement for the 'secure' part of the application (locally, on my PC only), I can serve up some of the pages (using Cassini) when I hit "F5", but pages with web services just bring up "server application unavailable".
I need to be able to add some more functions into the existing web services, among other things, so the ability to single-step through the code is still a necessity.
I'm sure someone who is limited to using Cassini has found a way to build and debug pages in VS2008 when web services are involved.
Thanks in advance.
EDIT: As it turns out, some links had "HTTPS://" hard-coded in them (I inherited these). Changing the links to "~/folder/page.aspx" allowed Cassini to serve things up properly.
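For reference, the fix amounted to letting ASP.NET resolve app-relative paths instead of hard-coding the scheme and host. A minimal code-behind sketch (the control and page names are made up):

    using System;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    // "~" resolves against the application root, so the same link works under
    // Cassini, IIS or IIS Express regardless of host, port or scheme.
    public partial class SecureLanding : Page
    {
        protected HyperLink lnkSecurePage;   // normally declared in the designer file

        protected void Page_Load(object sender, EventArgs e)
        {
            lnkSecurePage.NavigateUrl = "~/Secure/Default.aspx";

            // ResolveUrl performs the same translation for plain strings,
            // e.g. URLs emitted into JavaScript.
            string resolved = ResolveUrl("~/Secure/Default.aspx");
            ClientScript.RegisterStartupScript(GetType(), "secureUrl",
                "var secureUrl = '" + resolved + "';", true);
        }
    }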
Note that using Cassini is the default for VS2008, even for web services. Try starting a new HelloWorld web service project and confirm that you can debug it.
OK, so that worked. Then change the debugging options of your real project back to using Cassini rather than IIS. I wouldn't move the project (although backing it up might not be a bad idea), as you might be able to get IIS working again.
EDIT: So your actual problem wasn't to do with web services, just hard-coded URLs. (We have a similar problem where much of the site works wherever the root of the website is, but some places, such as "main menu" links, expect the root to be the root of the web server.)
You probably need to contact your IT department and have them open up something on the network so you can call the services - a port on a firewall, for instance.
Currently our dev team sets up all the websites they're working on in IIS on their local machines. We're thinking of switching to using the built-in ASP.NET development server instead.
Is this a good idea? What are the pros / cons of using the ASP.NET dev Server? Are there any gotchas we should be aware of?
Thanks.
NB: Running on Win XP / IIS 5 / VS2005
Edit:
Didn't realise it was called Cassini. More answers for Cassini vs. IIS here.
There is nothing that the ASP.NET development server can do that IIS can't (you can set breakpoints etc.; just attach the VS debugger to the ASP.NET runtime).
However, the ASP.NET development server does not represent a true production environment, and as such you can get caught by gotchas that you wouldn't expect when you deploy to production.
Because of that, I mandate that all development is done using IIS on a local machine. It doesn't take much work to configure a site in IIS.
It's a very good idea. Here are some reasons for:
You no longer need admin access to your machine for web development (it can still be helpful).
It's much easier to test a quick change and continue work, and faster iteration cycles are good.
It can simplify setup and deployment of your development environments.
The XP version of IIS has limitations, not present in the Server versions, that Cassini side-steps.
The only argument I know against is that there are a couple very rare edge cases where the Cassini built-in server doesn't exactly mimic IIS because you're using odd port numbers. I doubt you'll ever run into them, and using Cassini as the primary dev environment does not preclude developers from also having access to IIS on the machine. In fact, my preferred setup is Cassini first for most small work, then deploy to my local IIS for more in-depth testing before moving code back to the shared source repository.
[Edit]
Forgot about URL rewriting. You do need IIS for that. And an example of a limitation of the built-in XP IIS is that you are limited to one site (you can have multiple applications, but that's a different thing).
I had to switch (back) to IIS for one project, because I needed to set some virtual directories which is not possible on the ASP.NET Development Web Server.
As I stated here: https://stackoverflow.com/questions/103785/what-are-the-disadvantages-of-using-cassini-instead-of-iis your developers need to be aware that Cassini runs as the local user, which is typically an admin account for developers. Code under development will be able to access any file or resource that that account can, which is quite different from what it will see on an IIS 6 server.
The other thing that's a pretty big gotcha is debugging web services is much easier using IIS and vdirs rather than separate Cassini instances.
I know at one point I had an issue with authentication not working as expected on Cassini (the built-in development server).
Also, if you need to test things like ISAPI plugins (a rewriter, for example), I'm not sure how that's done on Cassini.
The constantly changing port is also rather disconcerting to me. Also, for each web project in your solution it fires up another instance of a Cassini server, and each one takes anywhere from 20 to 50 MB of memory.
I use IIS all the time; it's pretty easy to set up, and you guys are already doing that...
I've used both methods and I prefer having IIS locally vs. using the built-in server. At the very least you're more consistent with the final deployment setup.
Also, when using IIS 5.1, be sure to get JetStat IIS Admin; it adds functionality that is disabled out of the box on IIS 5, such as being able to set up multiple sites.
I have run into the following limitations with the ASP.NET dev server:
It does not support virtual directories. If you need them in your app, IIS seems to be your only choice.
Classic ASP pages don't run in the dev server. So if you have a mixed web app (like I have at my client right now), IIS seems to be the solution.
If you need an admin UI to configure settings, IIS works better
Of course IIS requires that you be a local admin.
Another distinction I noticed is that Cassini runs as a 32-bit process and you have no control over it, whereas you can control the application pool of your IIS app to disallow 32-bit (assuming your IIS is running on a 64-bit server). This becomes especially important if your web application is going to call APIs in 64-bit processes such as SharePoint Foundation/Server 2010. When you debug your web app with Cassini as your debug server, you'll get "The Web application at url could not be found. Verify that you have typed the URL correctly" type errors when instantiating objects. If you debug using IIS with the app running in an app pool that runs as 64-bit with an identity that allows access to sharepoint database then you'll be able to debug properly.
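If you need to flip that setting from a script rather than the IIS UI, here is a hedged sketch using Microsoft.Web.Administration (the pool name "MyAppPool" is a placeholder; it needs to run elevated on the machine hosting IIS):

    using System;
    using Microsoft.Web.Administration;

    // Force an IIS application pool to run a 64-bit worker process so calls
    // into 64-bit-only APIs (e.g. the SharePoint 2010 object model) can succeed.
    class SetPoolBitness
    {
        static void Main()
        {
            using (var manager = new ServerManager())
            {
                ApplicationPool pool = manager.ApplicationPools["MyAppPool"];
                pool.Enable32BitAppOnWin64 = false;   // false = 64-bit worker process
                manager.CommitChanges();
                Console.WriteLine("Updated bitness for {0}", pool.Name);
            }
        }
    }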
In VS12 the development server is very slow; it takes a few seconds to download a 2 KB file. This did not happen in VS10. When you have a bunch of jQuery and CSS files, this is a real problem. Also, every page re-requests all the CSS/JS files. Regression testing becomes very, very slow.
The main issue I've run into with the dev server is SerializationExceptions with custom security principals stored on the thread context. Details here.
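The workaround that is usually suggested (a sketch under the assumption that the principal type is yours to change; the names are illustrative) is to make the principal, and anything it carries, serializable, since the development server marshals Thread.CurrentPrincipal across AppDomain boundaries:

    using System;
    using System.Security.Principal;

    // Marking the custom principal [Serializable] lets it survive being
    // marshalled between AppDomains by the development server. GenericIdentity
    // is already serializable, which keeps the whole object graph safe.
    [Serializable]
    public class CustomPrincipal : IPrincipal
    {
        private readonly IIdentity identity;
        private readonly string[] roles;

        public CustomPrincipal(string userName, string[] roles)
        {
            this.identity = new GenericIdentity(userName);
            this.roles = roles ?? new string[0];
        }

        public IIdentity Identity
        {
            get { return identity; }
        }

        public bool IsInRole(string role)
        {
            return Array.IndexOf(roles, role) >= 0;
        }
    }

If that isn't possible, sticking to GenericPrincipal while debugging locally avoids the exception entirely.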