What stops people from downloading any website - Netflix?

I just learned that you can actually download an entire website using programs like HTTrack or IDM. What stops people from using these programs to download the whole Netflix library, for example, and never pay for a subscription? It shouldn't be that easy, so can someone tell me what the catch is?

Movies and shows are stored on separate servers/systems, and downloading just the HTML would not give you access to any of those files. Think of it like viewing the page source of any other website, even Stack Overflow: the source only shows you the HTML and references to other resources, not the files behind them.
By the way, as a heads up: this is not an on-topic question for Stack Overflow and does not meet its guidelines, so I would suggest asking this type of question on one of the other Stack Exchange communities.

I'm pretty sure movies and series are stored on different servers; downloading the HTML of a website doesn't give you access to their files.
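To make that concrete, here's a minimal sketch (using Python's requests library and example.com as a stand-in, since it doesn't touch Netflix at all): fetching a page only gives you its markup plus references to other resources; anything served behind authentication or DRM is never exposed to a simple crawler.

    import re
    import requests  # third-party HTTP client, assumed installed

    # Download the HTML of a page -- this is all a tool like HTTrack starts from.
    html = requests.get("https://example.com/").text
    print(html[:200])  # just markup, no media

    # The page only *references* other resources; each one is a separate,
    # possibly access-controlled request. Protected video streams simply
    # aren't linked as plain downloadable files.
    print(re.findall(r'(?:src|href)="([^"]+)"', html))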

Related

Is it possible to see all publicly-accessible files on a website?

I would like to query a website that provides files for download and see all of the files it makes available.
For example: a page at https://download.website.com/path/to/file has a file of interest to me, but I would also like to see the other files that are publicly available on the system.
Essentially I would like to be able to view a hierarchy of all of the publicly-facing files given some parent link. So if I know I want all files stored under https://download.website.com/path/, the query would turn up a recursive list of available files from https://download.website.com/path/*.
Is this even possible to do for most websites? Would allowing this behavior be too compromising to web frameworks in general, so it might not exist? Am I XYing out of control?
Any help here greatly appreciated.
This method isn't perfect, but you can try it: use Google search operators to find publicly available, indexed paths.
For example, if you want to find all indexed pages on a website/domain:
site:download.website.com
If you want to find all PDF files on the site:
site:download.website.com filetype:pdf
If you want to find all links under the path download.website.com/wp-content/:
site:download.website.com inurl:/wp-content/
I hope it will help you a little bit.
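If the server exposes auto-generated directory index pages (many don't), you can also walk the listing yourself. A rough sketch in Python, assuming requests and BeautifulSoup are installed and reusing the hypothetical URL from the question:

    from urllib.parse import urljoin
    import requests
    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    def list_files(index_url, seen=None):
        """Recursively collect links from an auto-generated directory index."""
        seen = set() if seen is None else seen
        resp = requests.get(index_url)
        if "text/html" not in resp.headers.get("Content-Type", ""):
            return seen  # reached a file, not a listing page
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            child = urljoin(index_url, a["href"])
            if child.startswith(index_url) and child not in seen:
                seen.add(child)
                if child.endswith("/"):  # subdirectory: recurse into it
                    list_files(child, seen)
        return seen

    # Only works if https://download.website.com/path/ actually serves an index page.
    for url in sorted(list_files("https://download.website.com/path/")):
        print(url)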

Corporate Intranet on IIS 7: Looking to Enhance Directory Listing

I am looking for the ability to enhance the appearance of the directory listing pages of a very basic corporate intranet I developed for use by our employees. I am using Windows Server 2012 R2, and the site is deployed in Internet Information Services.
I did not use anything like Visual Studio to create this, and I have already handwritten all of the CSS and HTML for the index page that serves as a jump-off point for the rest of the site so that users can get to the content they need; however, the directory listing pages where the users land leave quite a bit to be desired, to say the least.
I just want to be able to add some quick styles to these directories, such as modifying the font family and perhaps the link styles. Nothing major, really. The site already functions perfectly for what it was designed to do, and has been for years. This is just something that's always kind of bugged me but I never devoted any time to it. I'd like to do that now. Ideally I'd be able to just add something in the web.config file like inline CSS, or perhaps link it to a .css file that will house the styles. The latter is probably preferred, actually, but any way is fine.
Any help is greatly appreciated. To get this out of the way early: yes, I have done lots and lots and lots of searching on this topic — I'm talking hours. I have not been able to find a solution that seems to meet my needs. I consider posting here as somewhat of a last resort because I understand that it's a free resource and users here are usually quick to let other users know when they didn't find a particular article that seems to offer the solution they're seeking — which is usually a result of not knowing exactly what keywords to use — and I don't want to waste anyone's time. Just know that I have tried everything I know to find the solution, and that I'm genuinely stumped and looking for help from some pros.
Thank you!
Since the directory listing page is generated by IIS rather than served from an HTML file you control, you can't directly style it with CSS. However, there are a few options for changing the way it looks.
Write a script that generates your own custom-styled directory page and point users at that instead (a rough sketch of this approach follows the list). See this forum thread for tips on how to do that and a sample script.
Create a custom page using this module that you can further customize yourself.
Use the DirectoryListing open source app, which allows for customization of the directory page.
Any one of those solutions should give you more control over how the directory page looks.
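As a concrete illustration of the first option, here's a rough sketch of a script that writes its own styled index.html for a folder so IIS serves that instead of its built-in listing. Python is used here only for brevity; the folder path, page title, and stylesheet link are placeholders for whatever your intranet already uses, and you'd re-run it (e.g. as a scheduled task) when contents change.

    import html
    import os

    FOLDER = r"C:\inetpub\wwwroot\intranet\docs"   # placeholder: folder to list
    STYLESHEET = "/styles/intranet.css"            # placeholder: your existing CSS

    # Build one list item per file or subfolder in the directory.
    items = "\n".join(
        '  <li><a href="{0}">{0}</a></li>'.format(html.escape(name))
        for name in sorted(os.listdir(FOLDER))
        if name != "index.html"
    )

    page = ("<!DOCTYPE html>\n<html>\n<head>\n"
            '  <link rel="stylesheet" href="{0}">\n'
            "  <title>Documents</title>\n</head>\n<body>\n"
            "<ul>\n{1}\n</ul>\n</body>\n</html>\n").format(STYLESHEET, items)

    # Drop the generated page into the folder it describes.
    with open(os.path.join(FOLDER, "index.html"), "w", encoding="utf-8") as f:
        f.write(page)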

Suggestions for deciding on a WCMS for a hockey website?

I need to make a website for my hockey club. My main purpose for this site is to allow people to sign in and post articles and training schedules in their section, e.g. Men's, Women's, Juniors and Masters. I want to have some kind of upload manager that will allow them to choose where they post the info to (e.g. Men's, Masters and the homepage).
This is the main functionality I'm looking for at the moment.
The club's previous website used Joomla, which I hated; I found it way too restrictive. It's on an old version, so there are probably many improvements in the newer releases, but from what I've read it seems like it still has a lot of restrictions on how content is managed. I am open to trying it again, though.
I've used WordPress before and liked it, but that was on small-scale projects, and I'm not sure it really fits what I'll be trying to do here, since it mostly deals with blog posts and I'll need functionality to upload and display files.
I've had a look around at some other options like Squarespace and SilverStripe. I'm really liking the simplicity of SilverStripe (one thing I hate about Joomla is the clutter on the opening page) and am leaning towards it right now, if I can find a nice way to have people post news to multiple pages at once.
If anyone has any suggestions they'd be very welcome. I know HTML, CSS, JavaScript and a bit of PHP. I'm learning Ruby at the moment, so I wouldn't be against using it so I could learn more, but it might be a bit much for a sports website.
First off, it's nice to see someone who likes hockey too :) You can't use Squarespace; you'll need your own Apache server for what you want. You will need some way to store information, so you'll need a MySQL database and probably some more advanced knowledge of PHP (I'm assuming you don't yet know how to connect to databases and do some other functions). WordPress is too limited, so you can't use that. I have never used SilverStripe personally, but it seems like the best of your options here. You'll probably need more PHP knowledge before you attempt to build a members system.

Dashcode and external RSS feeds

I was wondering if anyone can help me with this. I've been looking everywhere for this information: I want to make a web application using a Dashcode RSS template. I know that you can't link to external sources. Does anyone know a way I can get around that? From what I understand a little PHP can get around this, but I'm unsure where to look.
OK, first thing: no PHP. Dashcode is limited to HTML, CSS and JavaScript. Having said that, there is a whole range of system calls that can be made using the functionality provided by various parts of the Xcode system.
Second: yes, you can link to external sources such as other websites, APIs (on, say, Twitter, Google, etc.), RSS feeds and so on; I'm not sure where you got the idea to the contrary.
If you want to learn how to do a Dashcode RSS project, open up Dashcode, start a new project (either web-based or Dashboard-based) and choose the RSS template. This will give you an out-of-the-box template where you can add your own information and see how it works. Then customise it.
In the above I am assuming Snow Leopard and the latest Dashcode/Xcode, but it will still give you most of what you want on earlier versions.

Automated link-checker for system testing [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Questions asking us to recommend or find a tool, library or favorite off-site resource are off-topic for Stack Overflow as they tend to attract opinionated answers and spam. Instead, describe the problem and what has been done so far to solve it.
Closed 8 years ago.
I often have to work with fragile legacy websites that break in unexpected ways when logic or configuration are updated.
I don't have the time or knowledge of the system needed to create a Selenium script. Besides, I don't want to check a specific use case - I want to verify every link and page on the site.
I would like to create an automated system test that will spider through a site and check for broken links and crashes. Ideally, there would be a tool that I could use to achieve this. It should have as many as possible of the following features, in descending order of priority:
Triggered via script
Does not require human interaction
Follows all links including anchor tags and links to CSS and js files
Produces a log of all found 404s, 500s etc.
Can be deployed locally to check sites on intranets
Supports cookie/form-based authentication
Free/Open source
There are many partial solutions out there, like FitNesse, Firefox's LinkChecker and the W3C link checker, but none of them do everything I need.
I would like to use this test with projects using a range of technologies and platforms, so the more portable the solution the better.
I realise this is no substitute for proper system testing, but it would be very useful if I had a convenient and automatable way of verifying that no part of the site was obviously broken.
We use and really like Linkchecker:
http://wummel.github.io/linkchecker/
It's open-source, Python, command-line, internally deployable, and outputs to a variety of formats. The developer has been very helpful when we've contacted him with issues.
We have a Ruby script that queries our database of internal websites, kicks off LinkChecker with appropriate parameters for each site, and parses the XML that LinkChecker gives us to create a custom error report for each site in our CMS.
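For reference, a rough Python sketch of that kind of wrapper (the --output=xml flag is real, but the element names used in the parsing step are an assumption about LinkChecker's XML report, so check what your version actually emits and adjust):

    import subprocess
    import xml.etree.ElementTree as ET

    sites = ["http://intranet.example.com/"]   # placeholder: your list of sites

    for site in sites:
        # Run LinkChecker and capture its XML report from stdout.
        result = subprocess.run(
            ["linkchecker", "--output=xml", site],
            capture_output=True, text=True,
        )
        root = ET.fromstring(result.stdout)
        # The <urldata>/<valid>/<url>/<result> names below are assumptions --
        # adjust them to match the XML your LinkChecker version produces.
        for urldata in root.iter("urldata"):
            if urldata.findtext("valid") != "True":
                print(site, urldata.findtext("url"), urldata.findtext("result"))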
I use Xenu's Link Sleuth for this sort of thing. It quickly checks for dead links etc. on any site; just point it at a URI and it'll spider all links on that site.
Description from the site:
Xenu's Link Sleuth (TM) checks Web sites for broken links. Link verification is done on "normal" links, images, frames, plug-ins, backgrounds, local image maps, style sheets, scripts and java applets. It displays a continuously updated list of URLs which you can sort by different criteria. A report can be produced at any time.
It meets all your requirements apart from being scriptable, as it's a Windows app that requires manual starting.
What part of your list does the W3C link checker not meet? That would be the one I would use.
Alternatively, twill (python-based) is an interesting little language for this kind of thing. It has a link checker module but I don't think it works recursively, so that's not so good for spidering. But you could modify it if you're comfortable with that. And I could be wrong, there might be a recursive option. Worth checking out, anyway.
You might want to try using wget for this. It can spider a site, including the "page requisites" (images, CSS and scripts), and can be configured to log errors. I don't know if it will have enough information for you, but it's free and available on Windows (via Cygwin) as well as Unix.
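For example, something along these lines should crawl a site without saving it and leave the errors in a log file (the host is a placeholder):
wget --spider -r -p -nv -o wget.log http://intranet.example.com/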
InSite is a commercial program that seems to do what you want (I haven't used it).
If I were in your shoes, I'd probably write this sort of spider myself...
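If you do go down that road, here's a minimal sketch of such a home-grown spider in Python (assuming requests and BeautifulSoup are available; the start URL is a placeholder). It stays on one host, follows page links plus CSS/JS/image references, and prints anything that doesn't come back with a 200:

    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup   # pip install beautifulsoup4

    START = "http://intranet.example.com/"   # placeholder start page
    HOST = urlparse(START).netloc

    seen, queue = {START}, deque([START])
    while queue:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            print("ERROR", url, exc)
            continue
        if resp.status_code != 200:
            print(resp.status_code, url)
        # Only parse HTML pages on our own host for further links.
        if urlparse(url).netloc != HOST or "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for tag, attr in (("a", "href"), ("link", "href"), ("script", "src"), ("img", "src")):
            for node in soup.find_all(tag, **{attr: True}):
                link = urljoin(url, node[attr]).split("#")[0]
                if link.startswith("http") and link not in seen:
                    seen.add(link)
                    queue.append(link)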
I'm not sure whether Checkbot supports form authentication, but it will handle cookies if you can get it going on the site, and otherwise I think it will do everything on your list. I've used it as a step in a build process before to check that nothing is broken on a site. There's an example of its output on the website.
I have always liked linklint for checking links on a site. However, I don't think it meets all your criteria, particularly the aspects that may be JavaScript dependent. I also think it will miss the images called from inside CSS.
But for spidering all anchors, it works great.
Try SortSite. It's not free, but seems to do everything you need and more.
Alternatively, PowerMapper from the same company has a similar-but-different approach. The latter will give you less information about detailed optimisation of your pages, but will still identify any broken links, etc.
Disclaimer: I have a financial interest in the company that makes these products.
Try http://www.thelinkchecker.com. It is an online application that checks the number of outgoing links, page rank and anchors. I think this is the solution you need.
