Why can't a torrent download some files via HTTP? - http

There is a torrent file with a web seed configured. Most of the files download normally, but when downloading certain files (text\american.ini) the connection to the server suddenly terminates and the download stops. You can verify this by selecting only that file for download when adding the torrent. At the same time, the same file downloads normally in a browser. What could be causing this? Tested with uTorrent and libtorrent.
Here you can download the torrent file and check it personally.

There are two different kinds of Web Seeds, BEP 19 and BEP 17, and one of them assumes that the server is specifically configured to work with torrent clients. Your torrent has a BEP 19 link, which is supposed to point to a file, or to a directory that has the same name as the torrent; that directory should contain the files in the torrent.
Your torrent name looks like this:
files/licence.txt
and your Web Seed looks like this:
https://website.com/projects/crmp/
It's not working because the Web Seed URL is wrong.
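For what it's worth, a quick way to see what the client is asking for: under BEP 19 the client joins the web seed base URL with the path of the file inside the torrent, so for the example above the request would look roughly like the following (the URL is just the one from the question plus the file path shown above; the exact join rules are in BEP 19):

# Hypothetical URL a BEP 19 client would request for files/licence.txt;
# if this returns 404 or the wrong content, the web seed base URL is wrong
curl -I "https://website.com/projects/crmp/files/licence.txt"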

The problem turned out to be that FileZilla, when uploading some files to the FTP server, modified them, so the torrent considered those files different. Solution: change the transfer mode from ASCII to binary.
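One way to spot this kind of silent modification is to compare checksums of the local file and a copy fetched back from the server; the file names here are just placeholders for illustration:

# If ASCII mode rewrote the line endings during transfer, the hashes will differ
sha1sum american.ini american.ini.downloaded-back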

Related

WebDAV/HTTP 1.1 Client server calls

I am a beginner with the WebDAV/HTTP protocol. I want to transfer files across the network using WebDAV. The raw camera image data files received from the source have to be placed on a WebDAV server (in a folder). These files are then copied by a remote client via WebDAV. My questions:
Is it necessary to use WebDAV/HTTP GET/POST/PUT method calls on the server machine to copy the images, or will a normal Linux/QNX command work?
If we do not need to go through WebDAV just to copy the files, how can we attach properties to these files, such as file name and size? Are these supplied to the remote client automatically by WebDAV using some OS file system calls?
Thank you in advance for your answers.
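For illustration only, here is roughly what the two sides of such a transfer can look like at the HTTP level (hostname and paths are made up; whether the server machine needs PUT at all depends on whether it writes into its own exported folder or talks to a separate WebDAV server):

# Server side: if the WebDAV folder is local to the server machine, an ordinary copy is enough
cp /data/camera/image_0001.raw /srv/webdav/images/

# If the WebDAV server is a separate machine, upload with an HTTP PUT instead
curl -T /data/camera/image_0001.raw http://webdav.example.local/images/image_0001.raw

# Remote client: list properties (name, size, dates) that the server reports for each file
curl -X PROPFIND -H "Depth: 1" http://webdav.example.local/images/

# Remote client: download a file with a plain GET
curl -O http://webdav.example.local/images/image_0001.raw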

HttpClient.PostAsync for uploading files

I am working on an API that needs to download a file from server A and upload it to server B on the same network. It's for internal use. Each file will have multiple versions and will need to be uploaded to server B multiple times, and all versions of the same file will share the same file name. This is my first time dealing with file manipulation, so please bear with me if my question sounds ignorant. Can I use HttpClient.PostAsync for the uploading part? Or can I just use Stream.CopyToAsync, if it's OK to simply copy the file over? Thanks!
Stream.CopyToAsync copies a stream from one to another within the memory of the same server.
In your case, you can use HttpClient.PostAsync, but on the other server there must be some API that receives the stream content and saves it to disk.
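As a rough sketch of the flow the answer describes, here is the same pattern expressed as plain HTTP calls (the internal URLs are hypothetical; with HttpClient you would pair a GET against server A with PostAsync against whatever upload endpoint server B exposes):

# Download one version of the file from server A
curl -o report.bin http://server-a.internal/files/report.bin
# Upload it to an endpoint on server B that receives the body and saves it to disk
curl -X POST --data-binary @report.bin "http://server-b.internal/api/upload?name=report.bin"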

How to add web folder to chef's private nginx?

In my Chef recipes I use remote_file to access files on an HTTP server. I would like to use the Chef server's nginx to serve those files.
How do I configure Chef's nginx to serve files from a specific folder over HTTP?
This is not a feature of Chef. You can use the files/ folder in a cookbook and the cookbook_file resource to serve files directly from the Chef Server, but this is very limited, and you should really use your own server to manage large files or complex downloads.
We have many machines on the LAN to be configured with Chef, and many remote_file resources in our recipes. The problem was that downloading from the internet takes a lot of time, even when the same file is already available on another machine on the LAN.
What I did is:
Installed Nexus for binary file storage.
Monkeypatched the remote_file resource using a library in a cookbook, so that it is loaded every time the cookbook is loaded, overriding the original resource behaviour.
Now, every time a remote_file resource has to download a file, it first asks Nexus. If the file is available in Nexus, it downloads it from there.
If there is no such file there, Chef downloads the file from the original source and then uploads it to Nexus as well for the other nodes.
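The check-then-upload logic described above boils down to something like the following (a sketch with curl against a hypothetical Nexus raw repository and made-up credentials; the real implementation lives inside the monkeypatched remote_file, but the HTTP flow is the same):

# Try the LAN cache in Nexus first
if curl -sfO http://nexus.lan:8081/repository/raw-cache/tools/agent.tar.gz; then
    echo "fetched from Nexus"
else
    # Fall back to the original source, then seed the cache for the other nodes
    curl -sfO https://downloads.example.com/tools/agent.tar.gz
    curl -u deploy:secret --upload-file agent.tar.gz \
        http://nexus.lan:8081/repository/raw-cache/tools/agent.tar.gz
fi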

Can't open a windows 10 encrypted file transferred to another PC because of the .PFILE extension

Objective: Using Windows 10 functionality, send an encrypted folder of files to a remote pc and allow the files to be decrypted for use on the remote pc.
(updated after initial post)
I have studied this for a bit and I'm not having success. I did this using a folder with 3 .pdf files for simplicity to confirm I understand the process - I don't. I followed these steps:
1) Right-clicked on folder > Properties > Advanced > checked 'Encrypt contents...' > OK'd my way out when the encryption was complete
The folder and file icons show a lock symbol on them and the filename and extension remained the same as before.
2) Exported the encryption certificate to a .pfx
3) Imported the exported .pfx from the previous step into the remote pc
4) Transferred the file to the remote PC using an SD card
The filename now has another extension added to it: '.PFILE'
I can't get past this step
Also, when I go into Properties > Advanced on the folder or contained files, instead of seeing the 'Encrypt Contents...' checkbox checked, it is unchecked.
The .PFILE extension is part of the Microsoft Rights Management service. It is not present when the file is on the encrypting machine.
I confirmed that the encryption process is effective on the encrypting machine by signing into another account on the same pc; that account could not open encrypted files until I imported the .pfx certificate into that user account. Again to confirm, those files in the 2nd user account on the encrypting pc do not have the .PFILE extension. The .PFILE extension only shows up on the remote pc.
So the issue is that I don't understand why .PFILE shows up on the remote PC, and what I should do about it.
Added after yet more testing: When I copy an encrypted folder from the encrypting PC to another PC on my network over the network connection, things work OK, i.e., the copied folder and its files show up on the other PC as encrypted in the manner I was looking for. However, this does not exactly solve my problem.
I would like to transfer these files to a PC that is not on my network. I don't seem to be able to copy the encrypted folder to an SD card, nor can I send the encrypted files in a transmission to the remote PC. I'm guessing 'that's just the way it is', so I don't know how to accomplish my objective as stated above.
Additional Results: I used BitLocker on my SD card and achieved a make-do solution for my need.
However, I would like to understand whether it is possible, and how it can be done, to open encrypted files on a different computer once the .PFILE extension has been appended to them, or whether I am misunderstanding a fundamental aspect of encryption.
PFILE extensions in this case are an artifact of Windows 10 encryption - the Encrypting File System (EFS) for sure, maybe BitLocker as well. It happens when you copy a Win10 encrypted file onto a file system that can't handle encryption (e.g., exFAT).
Those PFILEs can be opened on the original PC, and perhaps on another properly configured Win10 box (don't forget to install the same keys/certs on it).
But on systems prior to Win10, I don't have any confidence these PFILEs can ever be opened.
How to solve? The two not-so-useful answers:
Don't try to store encrypted files on that FAT (or other "old" file system) disk.
Copy the EFS file onto a file system that does support EFS (e.g., NTFS).
I have yet to find out how it would be possible to open a PFILE on a non-Win10 box, even though my target machines have all the right keys to open standard EFS files on an NTFS file system. Documentation is virtually non-existent (like you expected anything else).
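Two commands that can help confirm what is going on on the Windows side (the drive letter and file path are placeholders):

REM Check the file system of the SD card; PFILEs appear when it is not NTFS (e.g., exFAT)
fsutil fsinfo volumeinfo E:

REM Show whether a file is EFS-encrypted and which certificates can decrypt it
cipher /c "C:\secure\report.pdf"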

Deploying source to web server with deleting not needed files

When developing for asp.net using visual studio for web, it is really convenient that when you deploy your website to the web server, it will cleverly check which files have been changed and only upload those files. In addition, if you deleted some files from your source, it detects that too and deletes those files from the web server since they are no longer needed.
I started developing with the LAMP stack and am wondering how you can deploy to a web server in a similar way.
I tried using FileZilla, and when copy/pasting the source files to the web server you get these options if matching files already exist:
-Overwrite
-Overwrite if source is newer
-Overwrite if different size
-Overwrite if different size or source newer
"Overwrite if source is newer" works, kind of, but it only checks the date modified, not the content of the file. Also, the above method does not delete files from the web server that were deleted from the source.
Is there a better way to do this with Filezilla? (or maybe use some other program?)
Thanks.
You can use rsync to accomplish this.
When you want to push out changes, you would do something like this from your production server:
rsync -av user@<developmentIp>:/web/root/* /production/web/root/
The pattern is rsync --flags [user@host:]/source/dir [user@host:]/destination/dir
You only need the user@host part for remote hosts. The user must have SSH access to the host.
A couple of small suggestions:
The command can be run from the source or the destination. I find it better to run it from the destination, for permissions reasons (i.e., you're reading from the remote and writing to the local).
Do some tests first; I always mix up the directory details: do I need the trailing slash, should I use the star, ...
Read the man page; there are a lot of available options that may be helpful (--delete, --exclude, -a).
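Since the question specifically asks about removing files from the web server that were deleted from the source, that is what --delete does. A sketch (paths, user, and host are placeholders; -n gives a dry run so you can review what would change before running it for real):

# Preview: push the local source tree to the web server, deleting remote files
# that no longer exist locally, and leaving VCS metadata out of the sync
rsync -avn --delete --exclude '.git/' /home/dev/site/ user@webserver:/var/www/site/

# Run the same command without -n once the preview looks right
rsync -av --delete --exclude '.git/' /home/dev/site/ user@webserver:/var/www/site/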
