NextCloud: how to upload a large file via a shared link (WebDAV)

I am trying to find a way to enable large file uploads (>> 1 GB) on upload-enabled shared links. Direct uploading of small files via WebDAV works here, when one uses the last path segment of the share link as the HTTP user name.
However, the file is too large to upload within a single chunk (and our admin will not increase the maximum upload size).
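For reference, the working small-file upload looks roughly like this in Python (a minimal sketch, assuming the usual public-share WebDAV endpoint public.php/webdav; the server name, file name and share token are placeholders):

    import requests

    server = "https://cloud.example.com"   # placeholder
    token = "KbuXSrmcGq3k2Dw"              # placeholder: last segment of the share link

    # Public shares are exposed over WebDAV with the share token as the
    # HTTP user name and an empty password.
    with open("small.bin", "rb") as f:
        r = requests.put(
            f"{server}/public.php/webdav/small.bin",
            data=f,
            auth=(token, ""),
            headers={"X-Requested-With": "XMLHttpRequest"},  # some versions want this header
        )
    r.raise_for_status()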
I found two ways to do chunked uploads programmatically:
the OwnCloud one:
upload each chunk under a special file name <path/filename>-chunking-<transferid>-<chunkcount>-<index>
the NextCloud one (see the sketch at the end of this question):
manually create a directory under the URL <server>/remote.php/dav/uploads/<userid> (HTTP MKCOL request)
upload the chunks into that directory with HTTP PUT
assemble the chunks by issuing a MOVE on the special file .file in that directory
However, these two chunked methods don't work with shared links. So, is there any way to upload large files to a NextCloud server via shared links?
I want to do this from the shell, e.g. with curl or Python.
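For illustration, here is the NextCloud chunked flow I tried, sketched in Python (server, credentials, file name and transfer id are placeholders; note that this flow needs real account credentials, which is exactly what a share link does not provide):

    import requests

    server = "https://cloud.example.com"     # placeholder
    auth = ("alice", "app-password")         # placeholder account credentials
    upload_url = f"{server}/remote.php/dav/uploads/alice/web-file-upload-42"
    chunk_size = 10 * 1024 * 1024            # 10 MiB per chunk

    # 1. create the upload directory (MKCOL)
    requests.request("MKCOL", upload_url, auth=auth).raise_for_status()

    # 2. PUT the chunks; the names just need to sort in upload order
    with open("big.bin", "rb") as f:
        index = 0
        while chunk := f.read(chunk_size):
            requests.put(f"{upload_url}/{index:06d}", data=chunk,
                         auth=auth).raise_for_status()
            index += 1

    # 3. MOVE the special ".file" entry to assemble the chunks at the target
    requests.request(
        "MOVE",
        f"{upload_url}/.file",
        auth=auth,
        headers={"Destination": f"{server}/remote.php/dav/files/alice/big.bin"},
    ).raise_for_status()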

Related

How to download files in Synapse from an SFTP server that doesn't have actual files?

I have a somewhat complicated situation here:
I need to download files from an SFTP server daily. I connect to the SFTP server with a username and an SSH key; the key has a passphrase.
This SFTP server has no actual files: every file listed on it is 0 bytes. The server dynamically generates a file's content when it receives a "get" command.
So when I connect to the SFTP server with WinSCP, everything works perfectly.
But I have to do it in Synapse.
I managed to connect to it in a pipeline with a Copy activity, and I managed to download all the files, but with no data content inside.
Does anyone know how I can download the files with content?
If you actually have files with content in the SFTP location, then they should also be copied automatically by the pipeline in your Synapse.

If you want to copy only the files that have content and ignore the empty ones, you will have to use a Get Metadata activity to check the size of each file (i.e., > 0 bytes) and then filter to only those files in the copy. Using childItems you can get the fileName, Type and Size, and use these properties in the subsequent Copy activity to copy only the filtered files to your destination.
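For what it's worth, outside Synapse a plain SFTP get (which is what WinSCP issues, and apparently what triggers the server-side content generation) can be scripted in Python with paramiko; a sketch, with host, key file, passphrase and directory as placeholders:

    import paramiko

    # placeholders: host, user, key file, passphrase and remote directory
    key = paramiko.RSAKey.from_private_key_file("/path/to/key", password="passphrase")
    transport = paramiko.Transport(("sftp.example.com", 22))
    transport.connect(username="user", pkey=key)
    sftp = paramiko.SFTPClient.from_transport(transport)

    # a real "get" is what makes this server generate the file content,
    # so download each listed file directly
    for name in sftp.listdir("/outbound"):
        sftp.get(f"/outbound/{name}", name)

    sftp.close()
    transport.close()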

Force file download in a browser using ASP.Net MVC when the file is located on a different server without downloading it on my server first

Here's what I would like to accomplish:
I have a file stored in Windows Azure Blob Storage (or for that matter any file which is not on my web server but accessible via a URL).
I want to force the download of a file without actually downloading it to my web server first, i.e. the browser should fetch the file from the external URL and prompt the user to download it.
Possible Solutions Explored:
Here's what I have explored so far (and why they won't work):
Using something like FileContentResult, as described in "Returning a file to View/Download in ASP.NET MVC", to download the file. This solution would require me to fetch the contents onto my server and then stream them from my server to the browser. For this reason this solution won't work.
Using the HTML5 download attribute: the HTML5 download attribute would have worked perfectly; the problem is that, while it is a very neat solution, it is not supported in all browsers.
Changing the file's content type: another thing I could do (at least for the files that I own) is to change the content-type property of the file to something the browser wouldn't understand, forcing it to download the file. This might work in some browsers but not in all, as IE is smart enough to look beyond the content type and inspect the file's content to determine the type. Furthermore, if I don't own the files, I won't be able to change their content type at all.
Simply put, in my controller action I should be able to specify the URL of the file and somehow browser should force download the file.
Is this something which can be accomplished? If yes, then any ideas how I could accomplish this?
Simply put, in my controller action I should be able to specify the URL of the file and somehow browser should force download the file [without exposing the URL of the file to the client].
You can't. If the final URL is to remain hidden, your server must serve the data, so your server must download the file from the URL.
Your client can't download a file it can't get the URL to.
You can create a file-transfer WCF service (REST) which streams your content from blob storage, or from other sources via your file managers, directly to the client browser, addressed by a URL such as:
https://{service}/FileTransfer/DownloadFile/{id, synonym, filename etc}
The blob path won't be exposed, and the web application stays free of file-transfer concerns.
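Both answers boil down to the same streaming-proxy pattern. As a language-neutral sketch (Python/Flask here rather than WCF, with a hypothetical id-to-URL mapping so the blob path never reaches the client):

    import requests
    from flask import Flask, Response, abort

    app = Flask(__name__)

    # hypothetical mapping from public names to hidden blob URLs
    BLOBS = {"report.pdf": "https://account.blob.core.windows.net/files/report.pdf"}

    @app.route("/download/<name>")
    def download(name):
        url = BLOBS.get(name)
        if url is None:
            abort(404)
        upstream = requests.get(url, stream=True)        # server-side fetch
        return Response(
            upstream.iter_content(chunk_size=64 * 1024), # relay in chunks
            content_type="application/octet-stream",
            headers={"Content-Disposition": f'attachment; filename="{name}"'},
        )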

Creating a new file without using a ServletContext

Assume I want to write a new file within the space of my webapp.
One way would be to use getServletContext().getRealPath("/") and use that String to create a new file on the server. However, I often come across advice against using getServletContext().getRealPath("/").
Can someone please let me know of another way to write a new file within my webapp?
Many thanks.
Have a configuration property containing the absolute path of a directory outside of the webapp and web server path; read this path from the configuration property, and write to this directory.
To serve files from this directory, write a servlet that takes the name or ID of the file as a parameter, reads the file from the directory, and sends its content to the response.
This will
work even if the app is deployed as a war file and never unzipped to the file system
allow you to redeploy the next version of the app or server without deleting all the uploaded/created files
allow you to add whatever control you want on the uploaded/created files, instead of making them available to everyone
In short, treat these files as data, stored in a database which happens to be the file system instead of a SQL database.
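The same pattern can be sketched outside the servlet world, e.g. in Python/Flask (the environment variable name and directory path are placeholders):

    import os
    from flask import Flask, send_from_directory

    app = Flask(__name__)

    # the storage directory lives outside the webapp and comes from configuration
    STORAGE_DIR = os.environ.get("APP_STORAGE_DIR", "/var/data/myapp")

    @app.route("/files/<name>")
    def serve_file(name):
        # send_from_directory rejects names that would escape STORAGE_DIR
        return send_from_directory(STORAGE_DIR, name)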

Checking Wordpress core files

Is there a script or something that can check whether all core files are installed properly? I am installing a WordPress site on a client's hosting, and for some reason around 100 files were not transferred due to a connection timeout. Now I am moving them one by one, but I would still like to check somehow, once I am done, that all transferred files are there and that each is larger than 0 bytes.
Thanks.
Since you are using FileZilla, drag and drop all the files into the folder again.
Then, when the "file exists" dialog shows up, pick "Overwrite if different size" and check "apply to current queue only". Then only the files with a different size (or the ones that weren't transferred at all) will be overwritten/updated.
There's an easier way:
If you have access to some kind of control panel like cPanel, you can make a .zip file and upload only that one file via FileZilla.
Then in cPanel, go to the File Manager and unzip it there. It will be faster, and you only have to upload one file (rather than opening tons of connections and running into timeouts).
Or, if you have shell access, you can log in with your key using Terminal (Mac) or PuTTY (Windows), browse to the folder, and run the unzip command.
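To verify the transfer afterwards, a small script can walk the local WordPress tree and compare it against the server, flagging anything missing or 0 bytes. A sketch in Python with ftplib (host, credentials and paths are placeholders):

    import os
    from ftplib import FTP, error_perm

    ftp = FTP("ftp.example.com")            # placeholder host
    ftp.login("user", "password")           # placeholder credentials
    ftp.voidcmd("TYPE I")                   # SIZE needs binary mode on many servers

    local_root = "wordpress"                # local copy of the site
    remote_root = "/public_html"

    for dirpath, _dirnames, filenames in os.walk(local_root):
        for fname in filenames:
            rel = os.path.relpath(os.path.join(dirpath, fname), local_root)
            remote = f"{remote_root}/{rel.replace(os.sep, '/')}"
            try:
                size = ftp.size(remote)
            except error_perm:              # file missing on the server
                size = None
            if not size:                    # missing, or 0 bytes
                print("missing or empty:", remote)

    ftp.quit()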

Nautilus script: $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS empty for WebDAV folders

When writing a Nautilus script, $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS gives the path to the file whose context menu has been clicked, for instance /home/nico/test.txt.
But when the file is within a WebDAV share, the variable is empty.
Is it a bug?
How to get the path for a WebDAV file?
My script is intended to be used for files on WebDAV shares.
I have just found this list of variables:
https://help.ubuntu.com/community/NautilusScriptsHowto
The one I was looking for is $NAUTILUS_SCRIPT_SELECTED_URIS; it works on WebDAV too, returning for instance dav://admin@localhost:8080/alfresco/webdav/User%20Homes/leo/test.txt
Nautilus' $NAUTILUS_SCRIPT_SELECTED_FILE_PATHS is only for LOCAL (mounted) files, and by design is blank for remote files, as are $1, $2, ...
For REMOTE files, such as WebDAV or Samba network shares and FTP servers (or any other location where $NAUTILUS_SCRIPT_CURRENT_URI is not of the form file://...), use $NAUTILUS_SCRIPT_SELECTED_URIS
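A minimal Nautilus script using that variable might look like this in Python (the log path is a placeholder; Nautilus passes the selected URIs newline-separated in the environment):

    #!/usr/bin/env python3
    import os

    # newline-separated URIs, set by Nautilus for local AND remote selections
    uris = os.environ.get("NAUTILUS_SCRIPT_SELECTED_URIS", "").splitlines()

    with open(os.path.expanduser("~/nautilus-script.log"), "a") as log:
        for uri in uris:
            # e.g. dav://admin@localhost:8080/alfresco/webdav/.../test.txt
            log.write(uri + "\n")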
