Adobe CQ CDN integration

I am using AEM 5.6 and trying to integrate it with a CDN. I searched the web but could not find much apart from http://cq-ops.tumblr.com/post/110829806684/how-to-integrate-aem-with-a-cdn-such-as-akamai
Where can I start as a newbie? What are the standards and best practices for this? How does the CDN cache work, and how is it invalidated? Also, what CDN cache invalidation mechanisms can be used?

You can integrate Akamai with the author instance as well as the dispatcher URL. But since author content is mostly dynamic, you do not want to cache it in Akamai; there you can use Akamai purely as a safeguard, like a firewall. For the dispatcher/live URL, however, you can cache all the static content in Akamai. The TTL is configurable in their portal.
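As a sketch of what that looks like in practice, Akamai can also honor a TTL sent from the origin (the dispatcher) via its Edge-Control response header; directive names below are as documented by Akamai, but confirm that your property configuration is set to honor origin headers:

```
Edge-Control: cache-maxage=1h
Cache-Control: max-age=300
```

Here the edge would keep the object for an hour while browsers re-validate after five minutes; the TTL rules configured in the portal take precedence or defer to this depending on the property setup.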

Related

Redirect & mask URL to Azure subdomain

I have an ASP.NET MVC web app running on Azure as generic-site.co. It's a white-label site that supports a number of subdomains: acme.generic-site.co, globex.generic-site.co, initech.generic-site.co, etc. Browsing to each of them changes branding on the pages, but the underlying functionality is exactly the same.
Meanwhile I have an external domain name acme-site.com hosted by GoDaddy. I want to redirect this specifically to the acme.generic-site.co subdomain, but I also want to maintain acme-site.com as the root URL for any further browsing on that site, allowing users to have a pure acme experience without any indication of the underlying generic-site-ness.
I've tried to do this using GoDaddy's Domain Forwarding with masking, but I ran into CSRF issues almost immediately.
Is there any way I can achieve this? I suspect IIS URL rewriting might be helpful, but I'm at a loss as to how to proceed.
Don't use Domain Forwarding with masking.
Just add custom domain acme-site.com to Azure Web App.
And you may need to do one of the following:
Add a middleware (or similar) that changes the Host HTTP request header from acme.generic-site.co to acme-site.com.
Adjust the application to load the correct branding when it is accessed via the domain acme-site.com.
It is probably easier to use IIS Url Rewrite module as you mentioned in your question. There are several examples on how to do this. Please start with this post by Scott Forsyth: https://weblogs.asp.net/owscott/iis-url-rewrite-redirect-multiple-domain-names-to-one
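As a hedged sketch of what such a rule looks like (domain names taken from the question; the syntax is the URL Rewrite module's standard web.config form), a rule that permanently redirects the generic subdomain to the custom domain could be:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Redirect acme.generic-site.co/* to acme-site.com/* -->
      <rule name="Canonical acme domain" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^acme\.generic-site\.co$" />
        </conditions>
        <action type="Redirect" url="https://acme-site.com/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

With the custom domain added to the Azure Web App, further browsing then stays on acme-site.com and no masking (and therefore no CSRF breakage) is involved.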

Hosting WebAPI on a HTTPS Azure Web Role and HTML Javascript on HTTPS Azure CDNs?

I have an AngularJS WebAPI application that has a Javascript front-end. The front end makes calls to the back-end WebAPI for data. In the future there may well be more than one front-end making calls to the back-end.
I would like to change this application to use HTTPS and am looking into the best way to architect this. The way I see it, there are two options (maybe more).
(1) Host the WebAPI C# application, index.html, Javascript, Javascript libraries and other HTML on the one (or more) web roles.
(2) Host the index.html, Javascript, Javascript libraries and other HTML on a CDN and put the WebAPI C# application on one (or more) web roles at one location.
From a performance point of view are there likely to be any issues with the split solution (2) when I am using SSL. Is there anything that I should consider that might help improve the start-up time (my goal is for this to be as fast as possible).
One more question I have. If I were to use the Azure CDN then would I still be able to address the index of my web site as www.mywebsite.com and if using HTTPS would I need a SSL certificate?
Option 2 is preferable.
Think of it this way: your application is what sits in the backend. The front end is just a suggested set of UI controls and interactions for consuming that application. If you can host them separately, you gain some benefits, starting with not creating a UI dependency.
The approach would be like creating a thin client.
Since the application is AngularJS based, probably all the UI assets are static files with HTML, CSS, and Javascript. You can host them in BLOB storage and scale them through the CDN. You can have a custom domain name pointing to Azure Blob Storage, like `www.yourdomain.com`. This has many benefits, including better pricing and scaling than web roles; keep in mind that you pay for web roles whether or not you are getting hits. The only downside is that, as far as I know, it would not be possible to use HTTPS, but that should not be a problem, since you are just hosting static content and templates that contain placeholders, not actual data.
On Blob storage, you can attach your own cache control headers, allowing the browser to cache those files locally. A user would then download those files once and be served from the browser cache on subsequent visits. Also, you can store the content already compressed with GZIP and then set the content encoding property to let the browser know it is compressed, enabling a faster download. Don't forget to bundle your resources: for example, bundle all your JS code into one JS file, all your CSS into one CSS file, and all your AngularJS views into a template.js file (also bundled into the single JS file).
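Concretely, those blob properties (CacheControl and ContentEncoding on the blob, settable via the portal or SDK) translate into response headers like the following; the values here are illustrative, not prescriptive:

```
HTTP/1.1 200 OK
Cache-Control: public, max-age=604800
Content-Encoding: gzip
Content-Type: application/javascript
```

The max-age of 604800 seconds (7 days) lets browsers reuse the bundle without re-requesting it, and Content-Encoding: gzip tells them to decompress the pre-compressed blob on arrival.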
You do need to host your backend application on worker/web role instances, though. There you can use HTTPS, and it is no problem to make AJAX calls over HTTPS even if the page was loaded over HTTP, as long as the SSL/TLS certificate is signed by a CA recognized by the browser (i.e., a valid certificate). If you use a self-signed certificate, there will be no way for the browser to prompt the user to accept it. Keep this in mind if you plan to start with a self-signed one.
So then you would have everything that is not user- or state-dependent in blob storage, which is cheap, fast, and highly scalable, while all your user data interaction happens through your worker/web roles via compact request/response payloads, probably in JSON. You therefore need fewer web/worker roles to provide the same level of service.
Finally, if you have a very asymmetric ratio of queries to data-change requests, you should consider an approach like CQRS.

How to switch off Akamai caching for dynamic html files?

I run a WordPress site and am using Akamai for caching. I have a link on every page so the user can switch between the desktop and mobile site at any point. Once clicked, this link stores a cookie that is passed to the server with every request, so the server knows whether to return the mobile or desktop version.
Now, when I access the site via "origin" it all works fine, as that skips Akamai caching. However, when accessing the site normally, with Akamai caching, the link doesn't do anything. I'm assuming it's because, as far as Akamai is concerned, it is exactly the same URL request, and since Akamai already has a cached version, it returns the same page, ignoring the cookie altogether.
Is there any way to tell Akamai, directly from my PHP files in WordPress, not to cache HTML and to cache only images, CSS, etc.?
Or maybe is there a setting in Akamai itself where this can be specified?
If not then what other options would I have to get this working?
Yes, there are a number of ways to do this. The easiest would be to apply a no-cache rule to specific file extensions such as .html.
You can tweak which files are and are not cached in Akamai through the "Configuration Attributes and Digital Properties" screen.
Under "Time To Live Rules", you can define paths and their caching policies.
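If your Akamai property is configured to honor origin headers, the origin (your PHP/WordPress code) can also mark HTML responses as uncacheable at the edge while leaving images and CSS cacheable. A sketch of the response headers an HTML page would send (Edge-Control directive names as documented by Akamai; verify your property honors them):

```
Edge-Control: no-store
Cache-Control: private, no-cache
```

Static assets would omit these and carry a normal long max-age instead, so only the HTML bypasses the edge cache and the cookie-driven mobile/desktop switch works again.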
Apart from that, if you want to validate whether a particular web resource is served from Akamai or not, you can use Fiddler and a particular Pragma header.
For more details, see the question "Validate if web resource is served from AKAMAI (CDN)?".
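For reference, the debugging technique works by sending Akamai's Pragma extension values on the request and inspecting the extra response headers; the header names below are the publicly documented ones, and they only appear if debug headers are enabled for the property:

```
# Request header to add (e.g., in Fiddler)
Pragma: akamai-x-cache-on, akamai-x-get-cache-key, akamai-x-check-cacheable

# Example response headers from an edge server
X-Cache: TCP_HIT from a23-45-67-89.deploy.akamaitechnologies.com
X-Check-Cacheable: YES
```

A TCP_HIT indicates the object was served from the edge cache; TCP_MISS means it was fetched from the origin.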

Images from cdn, and problems with SSL

We have some images that are served through a CDN (non-SSL). During the checkout process our site switches to SSL, and now we're getting warnings because the page contains insecure elements.
Besides getting SSL on the CDN or moving all images to the secure domain, are there any workarounds for this?
Is it possible to do something like 'mirror' the images, or download them and then serve them as they're requested?
Using ASP.NET MVC.
Best practice is to load only secure content on an SSL page.
Here are your options:
Get SSL on your CDN
Host your checkout-associated images somewhere with SSL (your own server or elsewhere)
Subvert your own SSL certificate
Option 3 is done by laundering the content from your CDN to the client using your webserver as an intermediary, using AJAX and a server-side script. Unfortunately there's no way to do that without adding a lot of HTTP requests and probably a forced delay on top of that (to make sure the images are stored before the client tries to load them).
That'll hurt your page load time pretty bad, and at that point you might as well just host the images on your webserver(s) since that's where they're being stored and loaded from at the end of your chain anyway.
Basically, there's no real workaround; anything that did work would amount to a security hole.
The best solution is to enable SSL on the CDN, ideally with a URL that is compatible with the site's certificate.
The other alternatives (copying files back, setting up a proxy script) would obviously void all the benefits of the CDN.

How to cache external static content in ASP.NET MVC

While checking the page speed, Google PageSpeed suggested that I "Leverage Browser Caching", so I enabled caching in my MVC application using this code in the .config file:
<clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00"/>
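(For context, the `<clientCache>` element lives under `<staticContent>` in web.config, so the complete fragment looks like this; the 7-day value is the one from my config:)

```xml
<system.webServer>
  <staticContent>
    <!-- Sends Cache-Control: max-age=604800 (7 days) for static files served by IIS -->
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
  </staticContent>
</system.webServer>
```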
After this, static content that comes from my own domain is being cached; that part is working.
However, static resources that come from external domains are not being cached.
For example:
mydomain.com/content/scripts/somescript.js --> BEING CACHED
http://widget.uservoice.com/ha3YmZucx5RAYmq2cS9qw.js --> NOT BEING CACHED
Google is still suggesting me to "Leverage Browser Caching" for that reason.
How can I enable my application to cache static resources that come from external domains?
You can't control caching of static resources served from a third-party domain; that's not how things work. If a third-party resource is not being cached, the third party has either deliberately chosen not to employ caching (the resource may need to be always up-to-date to function properly) or has neglected to implement it. The only thing you can do about that is submit a ticket to the third party and ask them to fix it.
