Hunchentoot comes with some default files that it serves without having any new handlers added. They are stored in /path/to/hunchentoot/www/. I can't figure out how to stop them from being served. I've tried looking at *dispatch-table* and *easy-handler-alist* but nothing is there, and I can't find a way to remove them anyway. Is there a way to at least make it 404 when one of those pages is requested without removing all the files from that directory (which would get restored on update)?
These files are found through the value of the acceptor's document-root slot. You can override this when creating the acceptor:
(make-instance 'hunchentoot:easy-acceptor :port 8080 :document-root "/my/docs/")
Just point that into an empty directory to serve nothing by default.
If necessary, the error page templates that are located in www/errors/ can be read from elsewhere by setting the error-template-directory slot in a similar way.
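For example, a minimal sketch combining both settings (the directory paths are placeholders and assumed to exist; an empty document root means unmatched requests simply 404):
(defvar *acceptor*
  (make-instance 'hunchentoot:easy-acceptor
                 :port 8080
                 :document-root #p"/srv/empty-docroot/"
                 :error-template-directory #p"/srv/error-templates/"))
(hunchentoot:start *acceptor*)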
I'm using FastAPI and the Uvicorn server to build a website, but when I make changes to the CSS files and reload the webpage it doesn't pick up these changes.
In fact, even when I switch off the server and reload it, Uvicorn still doesn't pick up the changes to the CSS file.
Previously, the server picked up the changes fine. What's caused this to change?
The issue is not with Uvicorn, but with your web browser stashing 'static' files in its cache.
FastAPI uses a method that designates a directory the 'static folder'. This tells the server that the files in this directory should remain constant and don't need to be downloaded every time a webpage is loaded.
Check inside your app's main module and look for the following piece of code:
app.mount(
    "/your_static_file_web_path",
    StaticFiles(directory="your_static_directory"),
    name="your_name_for_static_app"
)
This call mounts a second, self-contained application inside your existing app that handles all of your static files. The directory argument to StaticFiles sets your static file directory. Anything within that directory will eventually be cached by your web browser, and further changes will not be loaded. This prevents your page from loading the updated CSS.
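For reference, a minimal self-contained version of that setup might look like this (the /static path and the static directory name are placeholders, not your app's actual values):
from fastapi import FastAPI
from fastapi.staticfiles import StaticFiles

app = FastAPI()

# Files under ./static are served at /static and may be cached by the browser.
app.mount("/static", StaticFiles(directory="static"), name="static")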
If you want to work around this issue, you can hold Shift and reload the web page - this is known as a hard refresh. A hard refresh forces your browser to re-download everything, including static files.
Problem
I have two controllers: RequestsController and ServicesController. Both contain Index actions.
When I browse to /Requests, it automatically runs the Index action, but for /Services or /Services/, it gives an HTTP 404 without even running the Index action.
Background
The route configuration is stock. The project also contains various classes under a top-level directory called Services.
Troubleshooting
The problem seems to be related to some sort of clash in naming between ServicesController and the top-level Services folder.
I can still access /Services/Index without a problem.
Debugging confirms that the Index method is being run for Requests but not Services when I don't specify the action name in the URL.
Renaming or removing the top-level Services folder causes the problem to stop happening.
I see. That's a common issue when you have a controller with the same name as a folder inside the root directory. Assuming that none of the files inside the Services folder are mapped by the StaticFileHandler, I think you can simply tell the routing system to map all "services" routes to the ServicesController by setting RouteExistingFiles to true...
routes.RouteExistingFiles = true;
Update
I completely overlooked the Content folder. You will need to prevent requests to the "Content" folder (or any other folder containing static files, such as the Scripts folder) from going through the routing pipeline by explicitly ignoring it BEFORE your route-mapping logic...
routes.IgnoreRoute("FOLDER_NAME/{*pathInfo}");
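Putting the two pieces together, RegisterRoutes could look roughly like this (the ignored folder names are just examples for a stock project layout; adjust them to yours):
using System.Web.Mvc;
using System.Web.Routing;

public static class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        // Keep genuinely static folders out of the routing pipeline.
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
        routes.IgnoreRoute("Content/{*pathInfo}");
        routes.IgnoreRoute("Scripts/{*pathInfo}");

        // Let /Services reach ServicesController even though a Services folder exists on disk.
        routes.RouteExistingFiles = true;

        routes.MapRoute(
            "Default",
            "{controller}/{action}/{id}",
            new { controller = "Home", action = "Index", id = UrlParameter.Optional });
    }
}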
Consider my file Test.mxml and its output file Test.swf.
Each time I make some changes in Test.mxml, the corresponding SWF file is regenerated. But this is causing a problem with the proxy server: I can't see my changed SWF file; I'm given the cached SWF instead, so the changes are not reflected.
When I change the name of the generated SWF (I tried versioning), it works fine and I'm able to see the new changes, because the proxy server loads the newly renamed file.
A few approaches to handle this:
It may be possible to tell your proxy not to cache this file if you have any control over it.
Sometimes people use the "random number" technique to prevent files from being cached: in the HTML page that wraps your SWF, add a random number to the SWF location, conceptually like myswf.swf?someRandomNumber.
Every time you deploy a new build you could change the filename.
You can also try having your browser send the no-cache headers, which causes the (WebSphere Edge) proxy server to dump its cached copy too. In Firefox, at least, Shift-Reload does this. I think that's true in IE and maybe Chrome too.
The basic idea is we have a Test environment which mimics Production, so customErrors="RemoteOnly". We just built a test harness that runs against the Test environment and detects breaks. We would like it to be able to pull back the detailed error. But we don't want to turn customErrors="Off" because then it doesn't mimic Production.
I've looked around and thought a lot, and everything I've come up with isn't possible. Am I wrong about any of these points?
We can't turn customErrors off at runtime, because when you call configuration.Save() it writes the web.config to disk and now it's off for every request.
We can't symlink the files into a new top-level directory with its own web.config, because we're on Windows and Subversion on Windows doesn't do symlinks.
We can't use URL mapping to make an empty folder dir2 with its own web.config and make the files in dir1 appear to be in dir2 - the web.config doesn't apply.
We can't copy all the aspx files into dir2 with its own web.config, because none of the links would be consistent and it's a horribly hacky solution.
We can't change customErrors in web.config based on hostname (e.g. add another DNS entry to the test server), because it's not possible/supported.
We can't do any virtual directory shenanigans to make it work.
If I'm not, is there a way to accomplish what I'm trying to do? Turn customErrors off site-wide under certain circumstances (a DNS name or even a querystring value)?
If you have customErrors="On" or "RemoteOnly", the error detail doesn't go away, right? I mean you can still access it and log it to another source using your custom error page.
Why don't you just have your custom error page in production log the information somewhere your harness can access it, like a message queue or the application event log? When the harness comes across a break, it just has to be smart enough to do a lookup in the right place for the full error info.
Another thing to consider, anyway.
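As a sketch of the logging half, assuming the write happens in Application_Error in Global.asax rather than in the error page itself, and assuming a hypothetical event source name:
using System;
using System.Diagnostics;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_Error(object sender, EventArgs e)
    {
        // "TestHarnessErrors" is a hypothetical event source; it has to be
        // registered once, e.g. EventLog.CreateEventSource("TestHarnessErrors", "Application").
        Exception ex = Server.GetLastError();
        if (ex != null)
        {
            EventLog.WriteEntry(
                "TestHarnessErrors",
                Request.RawUrl + Environment.NewLine + ex,
                EventLogEntryType.Error);
        }
        // No Server.ClearError() here, so customErrors still renders the same
        // friendly page it would in Production.
    }
}
The harness can then read the event log (or whatever store you pick) to recover the full error for a failed request.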
As part of a CMS, I have created a custom VirtualPathProvider which is designed to serve a single file in place of an actual file structure. I have it set up such that if a file actually exists on the server, that file will be served. If the file does not exist, the virtual content stored for that address will be served instead. This is similar to the concept of serving a website from files stored in a database, though in this case the content is stored in XML files on the server.
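For concreteness, a minimal sketch of the kind of provider described here (the class names and the XML lookup are hypothetical stand-ins for the CMS's own code):
using System.IO;
using System.Text;
using System.Web.Hosting;

public class CmsVirtualPathProvider : VirtualPathProvider
{
    public override bool FileExists(string virtualPath)
    {
        // A physical file wins; otherwise claim the path so virtual content is served.
        return base.FileExists(virtualPath) || LoadVirtualContent(virtualPath) != null;
    }

    public override VirtualFile GetFile(string virtualPath)
    {
        if (base.FileExists(virtualPath))
            return base.GetFile(virtualPath);
        return new StringVirtualFile(virtualPath, LoadVirtualContent(virtualPath));
    }

    // Hypothetical: look the address up in the XML content files; null means "not found".
    private static string LoadVirtualContent(string virtualPath)
    {
        return null;
    }

    private class StringVirtualFile : VirtualFile
    {
        private readonly string content;

        public StringVirtualFile(string virtualPath, string content)
            : base(virtualPath)
        {
            this.content = content;
        }

        public override Stream Open()
        {
            return new MemoryStream(Encoding.UTF8.GetBytes(content ?? string.Empty));
        }
    }
}
A provider like this would be registered in Application_Start with HostingEnvironment.RegisterVirtualPathProvider(new CmsVirtualPathProvider()).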
This setup works perfectly when a request is made to a specific page. For example, if I ask for "www.mysite.com/foobar.aspx", the content that is stored for "foobar.aspx" will be served. Further, if I ask for "www.mysite.com/subdir/foobar.aspx", the appropriate content will also be served.
The problem is this: If I ask for something like "www.mysite.com/foobar", things begin to fall apart. If the directory exists on disk (and doesn't have a configured default page in IIS, such as index.aspx), I will get a "Directory Listing Denied" error. If the directory does not exist, I'll simply get a 404 - Resource Not Found.
I've tried several things, and so far nothing I've done has made a bit of difference. It seems as though IIS is simply noting the nonexistence of a directory (or default file in an existing directory) and serving up its own error code, without ever asking my application what to do with the request. If it ever did get to the application, I would be able to solve the problem, but as it stands, I'm quite lost. Does anyone know if there is some setting in IIS that is causing this?
I've looked for every resource I can find on the subject, and am coming up empty. I know this should be possible, because I have read tutorials on serving content from both databases and ZIP files. HELP!
p.s., I am running IIS6 and .NET 3.5
IIS will only pass a request to the ASP.NET process if it is configured to do so for that particular extension. The defaults are .aspx, .ascx, etc. In other words, if you request a .html file, ASP.NET will never see that HTTP request. Likewise for an empty extension, such as an extensionless directory URL.
To change this behavior, add a wildcard mapping to the ASP.NET process. Load IIS Manager, go to the Properties for your web site and look at the Home Directory tab. Click on "Configuration" and there you will see the extension-to-application mappings.