How does CloudFront work? [closed]

I'm planning to implement Amazon's CDN (Content Delivery Network), known as CloudFront, in an ASP.NET MVC 3 application with C#.
I've googled about it, but I'm still a bit confused about the points below.
Do we have to upload all static resources to the CDN first before we can use them, or can Amazon pull the site's static resources from a predefined folder or directory on the site itself?
Does Amazon automatically update its copies when we change a static resource, or do we have to upload the updated resources to the CDN every time?

CloudFront is basically a cache. When a resource is first requested, CloudFront fetches a copy from your origin server, so you don't need to preload anything. If you are serving static resources, the easiest approach is to point CloudFront at an S3 bucket containing them.
If your origin server sets HTTP cache-control headers, CloudFront will use them to determine how often to check for updated files. Otherwise you can set a default TTL in the CloudFront settings. Here is Amazon's documentation.
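To make the "point CloudFront at an S3 bucket" and cache-header points concrete, here is a minimal sketch using boto3 in Python (the question targets ASP.NET MVC, but the AWS SDK for .NET exposes the same operations); the bucket name, distribution ID, and file paths are assumptions:

```python
import time
import boto3

s3 = boto3.client("s3")
cloudfront = boto3.client("cloudfront")

BUCKET = "my-static-assets"        # hypothetical S3 bucket used as the CloudFront origin
DISTRIBUTION_ID = "E1234567890"    # hypothetical CloudFront distribution id

# 1. Upload a static file to the origin bucket with a Cache-Control header.
#    CloudFront honours this header when deciding how long to keep its cached copy.
s3.upload_file(
    "Content/site.css",
    BUCKET,
    "content/site.css",
    ExtraArgs={"ContentType": "text/css", "CacheControl": "max-age=86400"},
)

# 2. If you replace a file before its cache lifetime expires, you can force
#    CloudFront to refresh it by issuing an invalidation for that path.
cloudfront.create_invalidation(
    DistributionId=DISTRIBUTION_ID,
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/content/site.css"]},
        "CallerReference": str(time.time()),  # must be unique per request
    },
)
```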

Related

Will web scraping only cause harm to those who have a website? [closed]

Today I scraped a website using beautifulsoup4 and tried to fetch about 16,000 records from that site.
Just a few minutes after that, the site went down and couldn't be accessed for a few hours.
So my question is:
Will web scraping only cause harm to those who have a website?
First of all, it is advisable to check a site's robots.txt file before bombarding it with automated requests like you just did; that is bad for the website owner as well as for you. Before starting to write a web scraper, follow these steps (a short sketch follows the list):
Check whether the website already has an API available to make your task easier. If not, go to step 2.
Check the robots.txt file at www.anywebsite.com/robots.txt. If the owner has published one (which in most cases they will), you can see whether robots are allowed to access the website. If they are, check which pages are disallowed and whether any rate limits (crawl delays) are specified.
If there is no robots.txt file, make sure you are gentle enough not to fire requests at the website at bullet speed. You might harm the owner's site and get blocked from accessing it permanently.
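A minimal Python sketch of steps 2 and 3 above, using urllib.robotparser from the standard library and the requests package; the site URL, user agent, URL list, and the one-second fallback delay are assumptions for illustration:

```python
import time
import urllib.robotparser

import requests

SITE = "https://www.anywebsite.com"      # placeholder site
USER_AGENT = "my-friendly-scraper/1.0"   # identify yourself honestly

# Read robots.txt and respect what it allows.
robots = urllib.robotparser.RobotFileParser()
robots.set_url(f"{SITE}/robots.txt")
robots.read()

# Honour a declared crawl delay if there is one; otherwise fall back to a polite pause.
delay = robots.crawl_delay(USER_AGENT) or 1.0

urls_to_scrape = [f"{SITE}/page/{i}" for i in range(1, 6)]  # placeholder URL list

for url in urls_to_scrape:
    if not robots.can_fetch(USER_AGENT, url):
        print(f"Skipping disallowed URL: {url}")
        continue
    response = requests.get(url, headers={"User-Agent": USER_AGENT}, timeout=10)
    print(url, response.status_code)
    time.sleep(delay)  # don't fire requests at bullet speed
```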

WordPress horizontal scaling: how to share files across servers? [closed]

So in a project I'm working on I'm trying to horizontally scale WordPress. My planned stack is:
HAProxy as a load balancer
3 web servers behind the load balancer running Nginx/PHP 7
1 Redis server
1 or more MySQL servers to make sure everything is OK for high availability
The issue that comes to mind is file uploads: if a user uploads a picture to WordPress, the picture will only be available on the Nginx/PHP VPS the load balancer sent them to.
My question is something like:
How can I centralize all uploads? For example by using a "shared" wp-content folder... I've read about GlusterFS and Ceph; would these be useful?
Rather than solving this strictly at the backend, I'd suggest you first consider putting something like CloudFlare in front of your WordPress site. You could set up caching on the upload directory, and you'll get enormous horizontal scalability out of that. It's basically free and pretty easy to set up. We've got CloudFlare in front of a site serving over 500,000 page views a day, and you'd be shocked at how light the load is on that server.
Beyond that, if you do put a load balancer in front of your site, you should be able to have it route traffic from the same user to the same backend node (sticky sessions), so uploads stay consistent for that user for the duration of their browser session. That gives a file-synchronization tool time to keep all of your balanced nodes in sync; you might look at https://github.com/bcpierce00/unison for this (a rough sketch follows).
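As a rough illustration of the "sticky sessions plus periodic sync" idea, here is a hypothetical Python sketch that shells out to Unison to reconcile the uploads directory with each peer node. The host names, paths, and interval are assumptions, and it presumes Unison is installed on every node with SSH keys already exchanged:

```python
# Hypothetical sketch: periodically reconcile wp-content/uploads with peer web nodes
# using Unison (two-way sync). Hosts, paths, and interval are placeholders.
import subprocess
import time

LOCAL_UPLOADS = "/var/www/html/wp-content/uploads"
PEER_NODES = ["web2.example.com", "web3.example.com"]   # the other balanced nodes

def sync_with_peers() -> None:
    for host in PEER_NODES:
        remote_root = f"ssh://{host}//var/www/html/wp-content/uploads"
        # -batch: run non-interactively, accepting Unison's default reconciliation
        subprocess.run(["unison", LOCAL_UPLOADS, remote_root, "-batch"], check=False)

if __name__ == "__main__":
    while True:
        sync_with_peers()
        time.sleep(60)  # sync once a minute; tune to your upload traffic
```

A shared filesystem (GlusterFS, Ceph, or NFS) avoids the sync lag entirely, but the periodic-sync approach is simpler to operate for a small cluster.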

ASP.NET MVC site with many images - need a CDN recommendation [closed]

I currently have a travel site built using ASP.NET MVC. The site has many articles and albums with lots of images associated with them. Currently we upload the images to a local folder and link them in the article content, and similarly for the albums. Since the number of images is growing day by day, every request that downloads lots of images puts more load on the web server.
I have seen other sites do something similar by serving the images from a separate subdomain, and some use a CDN.
I am currently on a shared hosting plan and I need to reduce the stress on the web server by serving the images from elsewhere.
Is there a CDN you would recommend that is not too costly but integrates well with my ASP.NET MVC site? I want to upload each image to the CDN and link to it from the CDN, instead of a local folder, on the article posting page.
If a CDN is not an option, can anyone suggest something else that would serve my purpose?
Thanks in advance!
Yes, you should move your content to a CDN, because a CDN delivers it much faster over the network and reduces the traffic to your web server.
There is a wide range of CDNs available on the market. I would suggest some of these for your purpose:
CDN77
Azure
MaxCDN
Akamai
Of the above, CDN77 and Azure are less costly than the others, in my experience.
You could also try CDN77's 14-day free trial for testing purposes.
Thanks,
Hayat S.

Service is down status [closed]

What is the best practice for informing users that a service is down? In my example it's an application upload function that may be over capacity.
Thanks.
Uli
There are many ways that this can be accomplished and it all depends on what best fits your particular application.
If all you are looking to do is disable a feature, as in your example of disabling uploads, you could put a prominent message at the top of that page saying the feature is not currently available and then disable the upload button on that page.
If you plan to take the whole site down for maintenance, it's good to have the maintenance notice as a separate page that is not linked to the rest of your site in any way. That way you can modify anything within your site, and also make substantial changes to your web server.
Have an error page that is pure HTML; that way, if the database goes down, you are not showing an error page that depends on a DB query. Otherwise your error page will itself error.
You could also return an HTTP error code if it is a web/HTTP upload endpoint (a sketch follows).
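A minimal sketch of the HTTP-error-code approach, using Python/Flask purely for illustration (the question doesn't specify a stack); the route and the maintenance flag file are assumptions:

```python
import os
from flask import Flask, jsonify

app = Flask(__name__)
MAINTENANCE_FLAG = "/tmp/uploads_disabled"  # hypothetical flag file toggled by ops

@app.route("/upload", methods=["POST"])
def upload():
    # If uploads are over capacity or disabled, tell clients explicitly.
    if os.path.exists(MAINTENANCE_FLAG):
        resp = jsonify(error="Uploads are temporarily unavailable, please retry later.")
        resp.status_code = 503               # Service Unavailable
        resp.headers["Retry-After"] = "600"  # hint: try again in ~10 minutes
        return resp
    # ... normal upload handling would go here ...
    return jsonify(status="ok"), 200
```

A 503 with a Retry-After header tells both users' clients and well-behaved crawlers that the outage is temporary.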
In our company, we have a procedure that does something like this (on our Apache server):
Stop a particular service
Edit .htaccess and add a rewrite rule pointing to a standard "down" page
When we come back:
Edit .htaccess and remove the rewrite rule
Restart the services
This is done by a bash script that we call when we want to shut down some services (a rough sketch is below).
You could do the same with a daemon that checks whether the server is overloaded and triggers that routine.
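Here is a rough Python rendering of that enable/disable routine; the .htaccess path, the marker comments, and the rewrite rule itself are assumptions for illustration:

```python
# Sketch of toggling a maintenance rewrite block in .htaccess (assumed paths/rules).
from pathlib import Path

HTACCESS = Path("/var/www/html/.htaccess")
MARKER_START = "# BEGIN maintenance"
MARKER_END = "# END maintenance"
MAINTENANCE_RULES = f"""{MARKER_START}
RewriteEngine On
RewriteCond %{{REQUEST_URI}} !^/down\\.html$
RewriteRule ^ /down.html [R=302,L]
{MARKER_END}
"""

def enable_maintenance() -> None:
    text = HTACCESS.read_text() if HTACCESS.exists() else ""
    if MARKER_START not in text:                      # don't add the block twice
        HTACCESS.write_text(MAINTENANCE_RULES + text)

def disable_maintenance() -> None:
    text = HTACCESS.read_text()
    start = text.find(MARKER_START)
    end = text.find(MARKER_END)
    if start != -1 and end != -1:                     # strip exactly the block we added
        HTACCESS.write_text(text[:start] + text[end + len(MARKER_END) + 1:])
```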

Host a WordPress blog and static websites on Amazon S3 [closed]

Is there a way I can host my WordPress blog and static website on Amazon S3?
As a matter of fact, maybe. Amazon announced static website hosting for S3 just last month, and the announcement blog post has all of the relevant details. Note, though, that it's intended for static content, not the dynamic content of a full WordPress installation. Consider using a different blogging engine, one that produces static HTML.
Amazon S3 won't run a WordPress blog itself. There may be plugins that generate a static website from a running WordPress installation, but I wouldn't recommend this.
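For completeness, a minimal boto3 sketch of the static-website-hosting feature the answers refer to; the bucket name and files are assumptions, and the bucket also needs a policy that allows public reads:

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "my-static-blog"  # hypothetical bucket name

# Enable static website hosting on the bucket.
s3.put_bucket_website(
    Bucket=BUCKET,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# Upload a pre-generated static page (e.g. the output of a static-site generator).
s3.put_object(
    Bucket=BUCKET,
    Key="index.html",
    Body=open("index.html", "rb"),
    ContentType="text/html",
)
# The site is then served from the bucket's website endpoint, e.g.
# http://my-static-blog.s3-website-<region>.amazonaws.com/ (given a public-read bucket policy).
```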
