How to add a web folder to Chef's private nginx?

In my Chef recipes I use remote_file to fetch files from an HTTP server. I would like to use the Chef server's nginx to serve those files.
How do I configure Chef's nginx to serve files from a specific folder over HTTP?

This is not a feature of Chef. You can use the files/ folder in a cookbook and the cookbook_file resource to serve files directly from the Chef Server, but that approach is very limited; you should really use your own server to manage large files or complex downloads.
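As a minimal sketch of the cookbook_file approach (the file name and destination path are made up for illustration), a file placed in the cookbook's files/ directory can be written out on a node like this:
cookbook_file '/etc/myapp/settings.conf' do
  source 'settings.conf'   # looked up in the cookbook's files/ directory
  owner 'root'
  group 'root'
  mode '0644'
  action :create
end
The file ships with the cookbook and is fetched from the Chef Server during the run, which is why this only suits small, static files.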

We have many machines on the LAN to be configured with Chef, and there are many remote_file resources in our recipes. The problem was that downloading from the internet takes a lot of time, while the same file was often already available on another machine on the LAN.
What I did is:
Installed Nexus for binary file storage.
Monkeypatched the remote_file resource with a library in a cookbook, so the patch is loaded whenever the cookbook is loaded and overrides the original resource behaviour.
Now, every time a remote_file resource has to download a file, it first asks Nexus; if the file is available there, it is downloaded from Nexus.
If the file is not in Nexus, Chef downloads it from the original source and then uploads it to Nexus as well, for the other nodes.
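A heavily hedged sketch of such a library follows; the Nexus URL, the lookup-by-basename convention and the module name are assumptions, and the step that uploads missing files back to Nexus is omitted:
# libraries/remote_file_nexus.rb
require 'net/http'
require 'uri'

module NexusFirstSource
  NEXUS_BASE = 'http://nexus.example.lan/repository/mirror'.freeze  # assumed LAN mirror

  # Intercept the source property: prefer the LAN copy when Nexus already has it.
  def source(*args)
    return super if args.empty?                      # getter call, keep original behaviour
    original = Array(args).flatten.first.to_s
    mirrored = "#{NEXUS_BASE}/#{::File.basename(original)}"
    nexus_has?(mirrored) ? super(mirrored) : super(*args)
  end

  private

  # HEAD request to check whether the artifact is already mirrored in Nexus.
  def nexus_has?(url)
    uri = URI.parse(url)
    Net::HTTP.start(uri.host, uri.port) { |http| http.head(uri.path).is_a?(Net::HTTPSuccess) }
  rescue StandardError
    false
  end
end

Chef::Resource::RemoteFile.prepend(NexusFirstSource)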

Related

rdiff-backup-like storage on Artifactory

I am looking for a way to store files in an Artifactory repository in a storage-efficient way and to upload/download only the difference between the local and remote versions, in order to save disk space, bandwidth and time.
There are two good utilities that work this way, rsync and rdiff-backup; surely there are others.
Is there a way to organize something similar with the Artifactory stack?
What is rsync:
DESCRIPTION
Rsync is a fast and extraordinarily versatile file copying tool. It can copy locally, to/from another host over any remote shell, or to/from a remote rsync daemon. It offers a large number of options that control every aspect of its behavior and permit very flexible specification of the set of files to be copied. It is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination. Rsync is widely used for backups and mirroring and as an improved copy command for everyday use.
JFrog CLI includes functionality called "Sync Deletes", which allows syncing files between the local file system and Artifactory.
This functionality is supported by both the "jfrog rt upload" and "jfrog rt download" commands. Both commands accept the optional --sync-deletes flag.
When uploading, the value of this flag specifies a path in Artifactory under which to sync the files after the upload. After the upload, this path will contain only the files uploaded during this upload operation; any other files under this path will be deleted.
The same goes for downloading, but this time the value of the --sync-deletes flag specifies a path in the local file system, under which files that were not downloaded from Artifactory are deleted.
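For example (the repository name and paths below are assumptions, not part of the original answer):
jfrog rt upload "build/*.jar" generic-local/myapp/ --sync-deletes="generic-local/myapp/"
jfrog rt download "generic-local/myapp/*" ./local-copy/ --sync-deletes="./local-copy/"
After the upload, generic-local/myapp/ holds only the jars just uploaded; after the download, files under ./local-copy/ that were not fetched from Artifactory are deleted.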
Read more about this at the following link:
https://www.jfrog.com/confluence/display/CLI/CLI+for+JFrog+Artifactory

Why torrent can't download some files via HTTP?

There is a torrent file in which a web seed is configured. Most files download normally, but when downloading some files (text\american.ini) the connection to the server suddenly terminates and the download stops. You can check this by selecting only this file for download when adding the torrent. At the same time, the file downloads normally from a browser. What could be causing this? Tested with uTorrent and libtorrent.
Here you can download the torrent file and check it personally.
There are two different kinds of Web Seeds, BEP 19 and BEP 17; one of them assumes that the server is configured to work with torrent clients. Your torrent has a BEP 19 link, which is supposed to point to a file or a directory with the same name as the torrent, and that directory should contain the files in the torrent.
Your torrent name looks like this:
files/licence.txt
and your Web Seed looks like this:
https://website.com/projects/crmp/
It's not working because the Web Seed URL is wrong.
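As a hedged illustration of the BEP 19 behaviour described above (the names are placeholders): for a multi-file torrent, the client typically composes each request as
<web seed URL> + <torrent name> + / + <file path inside the torrent>
e.g. https://website.com/projects/ + crmp + /files/licence.txt
so the seed only works when that composed path actually exists on the server.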
The problem turned out to be that FileZilla, when uploading some files over FTP, modified them, so the torrent considered these files different. Solution: change the transfer mode from ASCII to binary.

File transfer between local folder and remote URL using WebDAV

I want to move files from a local folder to a remote URL on a schedule using WebDAV.
I found the following link useful, but it shows a script that transfers only a single file; instead, I want to transfer all files from a local folder to a remote URL through WinSCP using the WebDAV protocol:
http://winscp.net/eng/docs/guide_automation
Any pointers would be helpful.
Use the following WinSCP script:
open http://user:password@example.com/
put d:\path\* /home/user/
close
Read about file masks.
If you really need to move the files (as opposed to copying them), add the -delete switch to the put command:
put -delete d:\path\* /home/user/
For scheduling, see the WinSCP guide to scheduling file transfers.
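A hedged example of running such a script on a schedule (the file paths are assumptions): save the commands above to a script file and invoke WinSCP's console from a Windows Task Scheduler job:
winscp.com /script=c:\scripts\upload.txt /log=c:\scripts\upload.log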

Deploying source to web server with deleting not needed files

When developing for ASP.NET using Visual Studio for Web, it is really convenient that when you deploy your website to the web server, it cleverly checks which files have changed and uploads only those. In addition, if you deleted some files from your source, it detects that too and deletes them from the web server, since they are no longer needed.
I have started developing with the LAMP stack and am wondering how to deploy to a web server in a similar way.
I tried using FileZilla; when copy/pasting the source files to the web server, you get these options if files already exist:
- Overwrite
- Overwrite if source is newer
- Overwrite if different size
- Overwrite if different size or source newer
"Overwrite if source is newer" kind of works, but it only checks the date modified, not the content of the file. Also, this method does not delete files from the web server that were deleted from the source.
Is there a better way to do this with Filezilla? (or maybe use some other program?)
Thanks.
You can use rsync to accomplish this.
When you want to push out changes, you would do something like this from your production server:
rsync -av user@<developmentIp>:/web/root/* /production/web/root/
The pattern is rsync --flags [user@host:]/source/dir [user@host:]/destination/dir
You only need the user@host part for remote hosts. The user must have SSH access to the host.
A couple of small suggestions:
The command can be run from the source or the destination. I find it better to run it from the destination, for permission reasons (i.e. you're reading from the remote and writing to the local).
Do some tests first; I always mix up the directory details: do I need the trailing slash, should I use the star, ...
Read the man page, there are a lot of options that may be helpful (--delete, --exclude, -a); a hedged example combining them follows below.
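For instance, a sketch that also removes files deleted on the development side (the --exclude pattern and paths are assumptions, not from the original answer):
rsync -av --delete --exclude='.git/' user@<developmentIp>:/web/root/ /production/web/root/
With --delete, files that exist under /production/web/root/ but no longer exist in the source are removed, which gives the "delete files that are no longer needed" behaviour the question asks for.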

How to download artifacts from Artifactory Server

We have a bunch of jars on the Artifactory server (the free version).
How can we download all the jars from it in a single HTTP request?
Do we need to tar all the jars into a single tar file in order to download them efficiently?
Thanks
Sincerely
Since you are the one who generates the files, you have two options:
As you said, generate the tar before uploading it. You'll still be able to search and browse the files inside it.
Write an afterDownloadError user plugin. Each time a user tries to access a URL ending in .tar, create the tar from the needed files and serve it.
