I've got a perfectly functioning WordPress site on Azure and a main website as a separate Azure website. I want the main website to act as a reverse proxy so that all the WordPress content is served to the outside world from our main domain.
All the web server jiggering I've done appears to work properly: the browser finds and displays the content, and the proxy web server has rewritten all the URLs so that 'myblog.azurewebsites.net' is replaced by 'mydomain.com/blog' (I can see that it's all correct when viewing the HTML page source).
However, there are a few files that the browser can't access, most notably the js files in the wp-includes folder. A 403 Forbidden error is returned.
Why would the original site send the article content but not the js files? Is there something special about the wp-includes folder? Do I need to loosen some permissions on it?
I found this somewhat related discussion, but can't find a way to modify my wp-config.php file so everything works.
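For reference, the usual wp-config.php change when serving WordPress behind a reverse proxy is to pin the site URLs to the public address. A minimal config sketch, assuming the mydomain.com/blog mapping described above and assuming the proxy forwards the original host in X-Forwarded-Host (these lines must go above the "stop editing" marker in wp-config.php):

```php
/* Sketch only: pin WordPress to the public proxy address.
   'mydomain.com/blog' comes from the setup described above. */
define('WP_HOME',    'https://mydomain.com/blog');
define('WP_SITEURL', 'https://mydomain.com/blog');

/* If the proxy forwards the original host, let WordPress see it so
   generated URLs (including those for wp-includes assets) match. */
if (isset($_SERVER['HTTP_X_FORWARDED_HOST'])) {
    $_SERVER['HTTP_HOST'] = $_SERVER['HTTP_X_FORWARDED_HOST'];
}
```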
I'm building a Flask/Apache2 web app that includes large video files that a user must be authenticated to view. I'd like Flask to serve the video files after authentication, rather than using a plain HTML video tag pointing at an exposed media directory, which anyone with the URL could access.
I've tried using Flask's send_file, which works for things like images and thumbnails, but large videos take forever to buffer and freeze up.
I see that NGINX has X-Accel-Redirect for handing off video files stored on my server (hidden, not in the web directory) to the user through Flask, but I'm not sure how to implement this with Apache2.
Running Flask, Apache2, Ubuntu
TLDR:
Basically I'm trying to send large files to the user through Flask (not via send_file, as this freezes) to enforce authentication, without exposing the whole media directory to the internet as I would with the HTML video tag. NGINX has X-Accel-Redirect; what's the Apache2 equivalent for use with Flask?
Thanks!
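On Apache the usual counterpart to X-Accel-Redirect is the third-party mod_xsendfile module: the vhost enables it (XSendFile On, plus XSendFilePath pointing at the hidden media directory), and the Flask view returns an empty response carrying an X-Sendfile header naming the file's absolute path; Apache then streams the file itself. A minimal sketch of the header-building side, with hypothetical paths and a helper name of my own:

```python
import os

# Hypothetical helper: builds the headers a Flask view would attach to an
# empty response when Apache's mod_xsendfile is enabled (XSendFile On).
# Apache strips the X-Sendfile header and streams the named file itself.
def xsendfile_headers(media_root, filename):
    path = os.path.normpath(os.path.join(media_root, filename))
    root = os.path.normpath(media_root)
    # Refuse anything that escapes the media root (path traversal).
    if not path.startswith(root + os.sep):
        raise ValueError("requested file is outside the media root")
    return {
        "X-Sendfile": path,           # absolute path for Apache to serve
        "Content-Type": "video/mp4",  # assumed type for this sketch
    }

print(xsendfile_headers("/srv/media", "lectures/intro.mp4"))
```

In the Flask view this would look roughly like `return Response(headers=xsendfile_headers(MEDIA_ROOT, name))` after the authentication check; the response body stays empty because Apache supplies it.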
I have a VS project that has HTTP links and a few HTTPS links to other domains. The site is hosted by GoDaddy. Once I publish the files, the formatting is totally off. It appears, based on Chrome's console, that all of my HTTP links (CSS and JS files) are being requested over HTTPS. The source document has http. I cannot figure out what is making this happen.
In the source it says <link href="../asset…
It works in debug, but once published, the same call in the console is <link href="https://…
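One likely explanation (an assumption, since the published URL isn't shown): if the published site is served over HTTPS, relative paths like ../asset inherit the page's scheme, so the console shows them requested via https; and any link written with an explicit http:// scheme to another domain gets blocked or upgraded by the browser as mixed content. Scheme-less references keep working either way, e.g.:

```html
<!-- root-relative path: the browser reuses the page's own scheme/host -->
<link rel="stylesheet" href="/assets/css/site.css">
<!-- scheme-relative URL for a cross-domain resource (hypothetical host) -->
<script src="//cdn.example.com/lib.js"></script>
```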
Specific to Web Apps hosted on Microsoft Azure, is there a way to prevent the mydomain.azurewebsites.net URL from being indexed by search engines? I'm planning to use a web app as a staging website, and don't want it to accidentally get indexed.
I know I could add a robots.txt file to the project with everything set to no-index, but I don't want to ever accidentally publish it to the production site (or alternatively, forget to publish it to the staging website).
Is there a setting in Azure that will prevent the ".azurewebsites.net" domain from being indexed? Or, if a robots.txt file is the only way, how do you organize an ASP.NET Core project so that the right robots.txt file is published to staging and to production?
Another option is to enable Authentication against your Azure Active Directory from the Authentication/Authorization tab in your App Service's settings for development and staging environments.
This way users will be forced to login to access those apps.
Documentation: https://learn.microsoft.com/en-us/azure/app-service/app-service-authentication-overview
https://learn.microsoft.com/en-us/azure/app-service/app-service-mobile-how-to-configure-active-directory-authentication
You can publish robots.txt to your staging server once, via FTP or via your SCM site. After that, web publish will not remove extra files on the server (including your robots.txt file) unless you select "Remove additional files at destination" in your web publish settings.
So the robots.txt file will stay on your staging server indefinitely unless you remove it, and you don't need to include robots.txt in your project or solution at all, so there's no risk of accidentally publishing it to your production environment.
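For completeness, the staging-only robots.txt described here is just the standard disallow-all file:

```
User-agent: *
Disallow: /
```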
Restrict access based on hostname and request IP
Unless your staging slot needs to be accessible from a wide range of dynamic IPs, you could use the URL Rewrite module and add rules to your web app's config that disallow traffic except from a few known IPs. Make those rules conditional on the HOST header matching the staging host (mydomain.azurewebsites.net) so they can never apply on the production hostname.
The details in the question here show a similar type of setup.
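A sketch of such a rule in web.config form; the rule name, the allowed address (taken from the 203.0.113.0/24 documentation range), and the host pattern are placeholders to adapt:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Block the staging hostname for everyone except a known IP.
             Never matches on the production hostname. -->
        <rule name="Block staging host" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^mydomain\.azurewebsites\.net$" />
            <add input="{REMOTE_ADDR}" pattern="^203\.0\.113\.10$" negate="true" />
          </conditions>
          <action type="CustomResponse" statusCode="403" reason="Forbidden" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```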
I am in the process of migrating an ASP.NET website to a plain HTML website. This is new to me, as I usually only do designs for websites.
The URL is the same and so are all the file names (apart from the extensions, of course). I have no experience with ASP.NET websites, which is where I am getting stuck. I am running the site on a Windows Server and would like to redirect all the .aspx files to the corresponding .html files.
I have read up on .htaccess but have had no luck; I think .htaccess requires Apache?
Does anybody have any idea how I can solve this? There are similar questions on here, but the answers aren't dumbed down enough for me!
Also, when I try the .htaccess option I get this error:
Server Error in '/' Application.
The resource cannot be found.
Description: HTTP 404. The resource you are looking for (or one of its dependencies) could have been removed, had its name changed, or is temporarily unavailable. Please review the following URL and make sure that it is spelled correctly.
Requested URL: /about-us.aspx
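Worth noting: .htaccess is an Apache mechanism, so IIS on Windows Server ignores it, which is consistent with the 404 above. The IIS equivalent is the URL Rewrite module, configured in web.config. A rough sketch of the .aspx-to-.html redirect, assuming URL Rewrite is installed (the rule name is arbitrary):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 every old .aspx URL to its .html counterpart -->
        <rule name="aspx-to-html" stopProcessing="true">
          <match url="^(.*)\.aspx$" />
          <action type="Redirect" url="{R:1}.html" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```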
KISS solution: if there are only a small number of pages and there really is no logic in the .aspx pages, you could simply load each possible page in a browser and save it as an .html file. Google Chrome lets you right-click on a page and choose Save as...; Internet Explorer and Firefox have similar functionality. The saved pages can then be uploaded to your GoDaddy site.
I recently migrated my site from an ASPX site to WordPress. The WP site is now hosted on Rackspace Cloud.
When I go to index.aspx I get the following message:
Server Error in '/' Application.
The resource cannot be found.
Description: HTTP 404. etc.
In previous migrations like this I uploaded an index.aspx file with a redirect inside, and that worked well, but now the server doesn't seem to find the file at all.
Neither a redirect from .htaccess nor the Redirection plugin worked; I just get that same message.
Any ideas?
I believe Cloud Sites supports serving both PHP and ASP content from the same "server". I suspect you have "Windows technology" enabled on your account, which is attempting to serve the index.aspx file and ignoring the .htaccess file. The following articles should help:
Cloud Sites KB - How can I redirect from ASP/.NET to PHP?
Cloud Sites KB - How do I enable a secondary technology?
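Once Apache is the one serving the site, a .htaccess redirect along these lines should take effect; the pretty-permalink target (/$1/) is an assumption about the new WordPress URL structure:

```apache
# Send the old ASPX home page to the WordPress front page
RedirectMatch 301 ^/index\.aspx$ /
# Map the remaining .aspx pages onto pretty permalinks (assumed structure)
RedirectMatch 301 ^/(.+)\.aspx$ /$1/
```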
Your server may not be configured to treat *.aspx as html files.