download a directory and its subdirectories off a virtual repository - recursion

Wondering if there is a way to download the root folder plus a bunch of subfolders (and subfolders of those folders) with all the files, keeping them in their respective folders.
I've tried some Firefox plugins like FlashGot and DownThemAll, but they grab the actual web page files in addition to the files in the repository, and only if those files are visible. For example, if I don't expand all the folders to expose the files in the repository, the plugins won't detect them.
I would just expand all the folders and expose the files, but these plugins won't recognize the folders: each one just downloads as "foldername".html, and all the files end up mixed together in one folder.
I've also tried VisualWget with recursive downloads enabled, but again, this only grabs the actual website files, not the files in the repository.
If anyone could help, it'd be greatly appreciated. I've been copying them manually, but there are literally thousands of files and folders, so I'm looking for a quicker solution.

As a client you can only download what's accessible. You either need to know the list of files or crawl the pages for the links, which is what the Firefox plugins do.
There's no way to get a list of files from the server without access beyond HTTP (unless the server speaks WebDAV or exposes some other API).
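If the repository's pages do link to the files, a command-line crawler can follow those links recursively. A minimal wget sketch (the URL, depth, and --cut-dirs value are assumptions you'd adjust for the real site):
wget -r -np -l 10 -nH --cut-dirs=1 "http://example.com/repository/"
-r recurses, -np stops it from climbing to parent directories, and -nH/--cut-dirs keep the local folder structure clean. Like the plugins, though, this only finds files that some page actually links to.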

I ended up getting it to work. I used the following command in Terminal.
scp -r username@hostaddress:/file/path/to/directory /path/to/my/computer/directory
-r is for recursive, so it downloads all files, directories, and subdirectories.
If you try this, be sure to run the command from your local terminal. I made the mistake of running it from the SSH session on the server (no negative effects, just frustrating).
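For large transfers like this, rsync is worth knowing about as well, since it can resume an interrupted transfer where scp would start over. A sketch using the same placeholder paths as above:
rsync -avz --partial username@hostaddress:/file/path/to/directory /path/to/my/computer/directory
-a preserves the directory structure and file attributes, and --partial keeps partially transferred files so a rerun picks up where it left off.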

Related

Is it possible to find hidden files on a website?

If I'm hosting a website, say at http://www.example.com, how can I find files that are in the same folder as index.html if I DON'T know the filenames?
So for example, if there are these files in there:
http://www.example.com/test.txt
http://www.example.com/test1.txt
Can anyone see this list of files? If so, how can I hide them, but make each one accessible to someone who knows the names? I don't want to use a password system, if possible.
If you put an index.html in that directory, no files will be listed. But if you allow uploads to that directory, someone could upload a PHP script that lists all files in the directory. If you don't know a file's name, you can try to guess it :)
You can use brute-force tools such as DirBuster, or you can look at the /robots.txt file for clues about what's on the website.
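At its core, that kind of brute-forcing is just trying candidate names against the server. A minimal shell sketch of the idea (wordlist.txt and the URL are hypothetical):
while read name; do
  code=$(curl -s -o /dev/null -w "%{http_code}" "http://www.example.com/$name")
  [ "$code" = "200" ] && echo "found: $name"
done < wordlist.txt
Dedicated tools like DirBuster do the same thing with large built-in wordlists and many parallel requests.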
By the way, you should keep in mind that most web servers nowadays have access controls in place, so even if such a file exists on the server, it may not be accessible without authentication.
Some hosting providers provide an option to specify whether directory listings are allowed. If enabled, and a client requests a URL for a folder that does not contain a default HTML file (index.html, default.html, default.aspx, etc), then the web server will serve up an HTML file containing a listing of the files in that folder. It is rare that this option is ever enabled, though.
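On Apache, for example, you can make sure listings stay off for a directory with a one-line .htaccess entry (assuming the host's AllowOverride settings permit it):
Options -Indexes
With that in place, a request for the bare folder URL returns 403 Forbidden instead of a file listing, while files requested by their exact names still work, which is what the question asks for.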
You do have to be exact when typing file names into the URL, though. Alternatively, you can use a pen-testing tool that will enumerate some of the names for free; for attempts at a full listing of files, you'll probably need a paid version.

Checking WordPress core files

Is there a script or something that can check whether all core files are installed properly? I am installing a WordPress site on a client's hosting, and for some reason around 100 files were not transferred because the connection timed out. Now I am moving them one by one, but I would still like a way to check, once I am done, that all the files are there and that each is larger than 0 bytes.
Thanks.
Since you are using FileZilla, drag and drop all the files into the folder again.
Then, when the "file exists" dialog shows up, pick "Overwrite if different size" and check "Apply to current queue only". Only the files with different sizes (or the ones that weren't transferred at all) will be overwritten/updated.
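As the next answer notes, shell access makes this even easier to verify. If you have it, a one-liner will spot any truncated (0-byte) uploads; the path is a placeholder:
find /path/to/wordpress -type f -size 0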
There's an easier way:
If you have access to some kind of control panel like cPanel, you can make a .zip file and upload only that one file via FileZilla.
Then, in cPanel, go to the File Manager and unzip it from there. It will be faster, and you only have to upload a single file (rather than opening tons of connections and running into timeouts).
Or, if you have shell access, you can log in with your key using Terminal (Mac) or PuTTY (Windows), browse to the folder, and run the unzip command.
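The shell version might look like this (paths and archive name are hypothetical):
cd /var/www/example.com/public_html
unzip -o site.zip
And if WP-CLI happens to be available on the host, wp core verify-checksums will compare the installed core files against the official checksums for your WordPress version, which directly answers the "are all core files installed properly" question.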

Backing up ASP.NET website code files to a backup folder under the website folder

I want to back up my existing ASP.NET web app before updating it.
To do this, I create a backup folder inside the website (i.e. at the same level as App_Code and web.config), named something like Backup_20110910.
Then I move all the current website files/folders (excluding web.config and App_Data) into the backup folder.
Then I extract the zip of the latest code into the now-clean folder.
Are there any potential problems with this approach? After all, it increases the number of C# files in the website folder; could there be conflicts, etc.?
I wouldn't back up within the folder structure; there's a chance that someone then finds your backup folders, browses to them, and runs the older code. If you zip the backup instead, you suddenly have a file someone can download. Even more amusingly, a lot of people rename the old web.config to web.config.bak when they change it, and plenty of security scanners look for exactly that, because the file can now be downloaded: it's no longer a .config file, it's a .bak.
Back up outside the web root, not within it, and all of those worries go away.
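If you absolutely must keep a backup under the web root, one partial mitigation is to deny requests to that folder in web.config. A sketch using the folder name from the question (note that this only covers requests handled by ASP.NET; static files such as a .zip or .bak may still be served directly by IIS depending on its configuration):
<location path="Backup_20110910">
  <system.web>
    <authorization>
      <deny users="*" />
    </authorization>
  </system.web>
</location>
Backing up outside the web root, as suggested above, remains the safer option.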
There won't be an issue, except that it might become confusing to have identical folder structures nested within the current folder structure. It's always wisest to keep backups completely separate from the current build.

Nautilus script: $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS empty for WebDAV folders

When writing a Nautilus script, $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS gives the path to the file whose context menu has been clicked, for instance /home/nico/test.txt.
But when the file is within a WebDAV share, the variable is empty.
Is it a bug?
How to get the path for a WebDAV file?
My script is intended to be used for files on WebDAV shares.
I have just found this list of variables:
https://help.ubuntu.com/community/NautilusScriptsHowto
The one I was looking for is $NAUTILUS_SCRIPT_SELECTED_URIS; it works on WebDAV too, returning for instance dav://admin@localhost:8080/alfresco/webdav/User%20Homes/leo/test.txt
Nautilus' $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS is only for LOCAL (mounted) files, and by design it is blank for remote files, as are the positional arguments $1, $2, ...
For REMOTE files, such as WebDAV or Samba network shares, FTP servers, or any other location where $NAUTILUS_SCRIPT_CURRENT_URI does not look like file://..., use $NAUTILUS_SCRIPT_SELECTED_URIS.
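Putting that together, a minimal script body that works for both local and remote selections might look like this ($NAUTILUS_SCRIPT_SELECTED_URIS is newline-separated; zenity here is just a stand-in for whatever the script actually does):
#!/bin/bash
while IFS= read -r uri; do
    [ -n "$uri" ] || continue
    zenity --info --text "Selected: $uri"
done <<< "$NAUTILUS_SCRIPT_SELECTED_URIS"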

How to edit the .htaccess file when it is hidden over FTP

I usually cannot see the .htaccess file when I log in to remote servers over FTP, because it is hidden.
Since I don't have shell access, I usually perform the following steps to edit the file:
I change the settings on my Mac (from Terminal) to show invisible files (the exact command is sketched after the question)
I open the .htaccess file from a standard Drupal installation and edit it
I upload it to the remote server, overwriting the existing one
I disable hidden files on my Mac again
I was wondering if there is a faster solution.
Thanks
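For reference, the Mac-side toggle mentioned in the first step is usually done like this on recent OS X versions (Finder restarts when you run the second command):
defaults write com.apple.finder AppleShowAllFiles -bool true
killall Finder
Run it again with -bool false to hide the files afterwards.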
I often have a separate file in my Drupal root called production.htaccess or something along those lines. Not only does this expose the file in Finder without revealing every single .DS_Store on my system, it also allows me to set separate .htaccess directives for different environments. Then, I just rename production.htaccess to .htaccess after I upload it to the server.
More often than not, the two .htaccess files are identical, but even in that case, I still use this method for the sake of convenience.
The FTP application should have an option to show hidden files; that option is normally available in FTP clients for Mac OS X.
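Since a hidden file can still be addressed by its exact name, you can also skip the listing problem entirely and transfer .htaccess directly from the command line. A sketch with placeholder credentials and paths:
curl -u user:password "ftp://ftp.example.com/public_html/.htaccess" -o .htaccess
curl -T .htaccess -u user:password "ftp://ftp.example.com/public_html/"
The first command downloads the file; the second uploads your edited copy back.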
