I've just started studying DocFX. According to its official guide we build content with a command similar to docfx docfx.json --serve and then view the generated site at http://localhost:8080. My question is: if DocFX is a static site generator, why does it serve content via a web server? Why doesn't the guide just say to open index.html to view the generated site? Is there a difference?
DocFX does generate static content; however, the main index.html page will attempt to load some assets, such as the table of contents (toc.html), using an XMLHttpRequest from the browser. Such requests will be blocked by the browser if you have loaded the site by opening the index.html page from disk.
If you try, open the F12 dev tools in Chrome (or the browser of your choice) and you will see warnings such as:
Access to XMLHttpRequest at 'file:///your-path/_site/toc.html' from
origin 'null' has been blocked by CORS policy: Cross origin requests
are only supported for protocol schemes: http, data, chrome-extension,
edge, https.
As a result, the site really needs to be loaded from a web server over HTTP to render fully.
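Any static file server will do for this. For example, DocFX itself can serve an already-built site directly (assuming the default _site output folder; yours is configured in docfx.json):

docfx serve _site

This serves the folder at http://localhost:8080, just like the build-and-serve command from the question, but without rebuilding.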
Related
I have a VS project that has HTTP links and a few HTTPS links to other domains. The site is hosted by GoDaddy. Once I publish the files, the formatting is totally off. It appears, based on Chrome's console, that all of my HTTP links (CSS and JS files) are being requested over HTTPS, even though the source document has http. I cannot figure out what is making this happen.
In the source it says <link href="../asset…
It works in debug, but once published the same call shows in the console as <link href=https://…
I have a self-hosted app on ASP.NET OWIN that can show an HTML page with login and password, call web services, and show results from a Web API. Suddenly I hit a wall: every static .json data file responds with 404. It is set up using PhysicalFileSystem.
Part of the application is bundled with a JavaScript frontend that is also hosted inside the self-host; it is only bundle.js plus bundle.map for debugging. I tried turning off CORS, but I still can't see any .json file. If I rename a file to .txt, I can see it.
Please recommend a debugging method to trace this issue, or maybe you recognize this behavior.
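The .txt-versus-.json clue suggests the static file middleware has no MIME mapping for .json (older Microsoft.Owin.StaticFiles builds omit it), so it refuses to serve the file rather than guess a content type. A minimal sketch of adding the mapping, with illustrative names and paths rather than the asker's actual setup:

using Microsoft.Owin.FileSystems;
using Microsoft.Owin.StaticFiles;
using Microsoft.Owin.StaticFiles.ContentTypes;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Assumption: .json is missing from the default extension-to-MIME map,
        // which would explain why renaming to .txt (which is mapped) works.
        var contentTypes = new FileExtensionContentTypeProvider();
        contentTypes.Mappings[".json"] = "application/json";

        var options = new FileServerOptions
        {
            FileSystem = new PhysicalFileSystem(@".\wwwroot") // illustrative path
        };
        options.StaticFileOptions.ContentTypeProvider = contentTypes;
        // Quick diagnostic alternative:
        // options.StaticFileOptions.ServeUnknownFileTypes = true;

        app.UseFileServer(options);
    }
}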
I have seen this type of question asked, but the other way around, namely redirecting HTTP to HTTPS.
This is my scenario: I have an ASP.NET Core app that I deploy to my test server.
When I view it in an external browser, the JS and CSS etc. will not load. Upon inspection, everything is being redirected to HTTPS. I do not want this.
How can I force it to 'stay' with HTTP?
Just remove the app.UseHttpsRedirection(); call in the Configure method of the Startup class.
Then nothing will be served via HTTPS unless explicitly requested.
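A minimal sketch of what that Configure method might look like afterwards (an illustrative pipeline, not the asker's actual Startup):

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;

public class Startup
{
    public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
    {
        // app.UseHttpsRedirection(); // removed: no more automatic HTTP -> HTTPS redirect
        // Also remove app.UseHsts() if present; HSTS makes browsers force HTTPS on their own.
        app.UseStaticFiles();          // JS and CSS now load over plain HTTP
        app.UseRouting();
        app.UseEndpoints(endpoints => endpoints.MapControllers());
    }
}

Note that if the browser has already cached an HSTS policy or a 301 redirect for the site, you may need to clear that before plain HTTP works again.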
Go to project properties --> Debug --> uncheck Enable SSL.
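The same setting is persisted in Properties/launchSettings.json; a sketch of an HTTP-only configuration (port number and profile name are illustrative), where sslPort: 0 disables SSL for IIS Express:

{
  "iisSettings": {
    "iisExpress": {
      "applicationUrl": "http://localhost:5000",
      "sslPort": 0
    }
  },
  "profiles": {
    "MyApp": {
      "commandName": "Project",
      "applicationUrl": "http://localhost:5000"
    }
  }
}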
One more hint:
remove the scheme from the URL, so the referenced scripts and CSS files will load according to the scheme of the client's URL (http or https):
<script src="//cdn.mysite.com/myscript.js" type="text/javascript"></script>
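The browser resolves such protocol-relative URLs against whatever scheme the current page was loaded with, so the same markup works on both HTTP and HTTPS pages.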
I've got a perfectly functioning WordPress site on Azure AND a main website as a different Azure website. I want the main website to be a reverse proxy server so that all the WordPress content is served to the outside world from our main domain.
All the web server jiggering I've done appears to work properly: the browser finds and displays the content, and the proxy web server has rewritten all the URLs so that 'myblog.azurewebsites.net' is replaced by 'mydomain.com/blog' (I can see that it's all correct when viewing the HTML page source).
However, there are a few files that the browser can't access, most notably the JS files in the wp-includes folder. A 403 Forbidden error is returned.
Why would the original site send the article content but not the JS files? Is there something special about the wp-includes folder? Do I need to loosen some permissions on it?
I found this somewhat related discussion, but can't find a way to modify my wp-config.php file so everything works.
I run a WordPress site and am using Akamai for caching. I have a link on every page so the user can switch between the desktop and mobile site at any point. Once clicked, this link stores a cookie that is passed to the server with every request, so the server knows whether it needs to return the mobile site or the desktop version.
Now, when I access the site via "origin" it all works fine, as that skips Akamai caching. However, when accessing the site as normal, with Akamai caching, the link doesn't do anything. I'm assuming this is because, as far as Akamai is concerned, it is exactly the same URL request, and since Akamai already has its cached version it returns the same page, ignoring the cookie altogether.
Is there any way to tell Akamai directly from my PHP files in WordPress not to cache HTML, and to cache only images, CSS, etc.?
Or is there maybe a setting in Akamai itself where this can be specified?
If not, what other options would I have to get this working?
Yes, there are a number of ways to do this. The easiest would be to apply a no-cache rule to specific file extensions such as .html.
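If your Akamai property is configured to honor origin cache headers, you can also signal this per response from the origin itself. A hedged example of the response headers (Edge-Control is Akamai-specific and only takes effect when the property honors it):

Cache-Control: no-store
Edge-Control: no-store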
You can tweak which files are and are not cached in Akamai through the "Configuration Attributes and Digital Properties" screen.
Under "Time To Live Rules", you can define paths and their caching policies.
Apart from that, if you want to validate whether a particular web resource is served from Akamai or not, you can use Fiddler and a particular Pragma header.
Refer to the link Validate if web resource is served from AKAMAI (CDN)?? for more details.
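For reference, the commonly used Akamai debug request header looks like this (documented Akamai values; the response then carries diagnostic headers such as X-Cache: TCP_HIT or TCP_MISS):

Pragma: akamai-x-cache-on, akamai-x-check-cacheable, akamai-x-get-cache-key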