I have to upload files to and download files from blob storage. I found a good tutorial on uploading and downloading files, but I have some queries.
I want to create a folder structure and do operations like:
a. Fetch a particular file from a folder
b. Fetch all files in a folder and its subfolders
c. Fetch the names of files in a particular folder
d. Fetch the names of files in a particular folder and its subfolders
I also want to upload files to a particular folder or subfolder.
What are the best practices for doing this, and should I use queues for any of it?
What would be the performance impact if I am uploading large files to blob storage?
You can't really use queues for that purpose. Reasons being:
The maximum size of a message in an Azure queue is 64 KB. What would happen if your file is larger than 64 KB?
More importantly, queues are not meant for that purpose. Queues are typically used as an asynchronous communication channel between disconnected applications.
Search around and you will find plenty of examples of uploading files to blob storage.
For uploading folders, essentially you will iterate over a folder and list all files and upload these files. Since blob storage doesn't really support folder hierarchy, you would need to name the blob by prepending the folder structure to the name of the file. For example, let's say you're uploading files from C:\images\thumbnails folder in a blob container named assets. If you're uploading a file called a.png, you can name the blob as images/thumbnails/a.png and that way you can preserve the folder structure.
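Here is a minimal sketch of the upload and of the folder-style queries (a)–(d), using the Azure.Storage.Blobs .NET SDK; the connection string, container and file names are illustrative:

```csharp
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// Illustrative names; replace the connection string with your own.
var container = new BlobContainerClient("<connection-string>", "assets");
await container.CreateIfNotExistsAsync();

// Upload C:\images\thumbnails\a.png as "images/thumbnails/a.png" so the
// folder structure is preserved in the blob name.
var blob = container.GetBlobClient("images/thumbnails/a.png");
await blob.UploadAsync(@"C:\images\thumbnails\a.png", overwrite: true);

// (a) Fetch a particular file from a "folder": address it by its full name.
await container.GetBlobClient("images/thumbnails/a.png")
               .DownloadToAsync(@"C:\temp\a.png");

// (b)/(d) All files (or just their names) in a folder and its subfolders:
// a flat listing filtered by prefix.
await foreach (BlobItem item in container.GetBlobsAsync(prefix: "images/"))
    Console.WriteLine(item.Name); // e.g. images/thumbnails/a.png

// (c) Names of files directly inside one folder only: a hierarchical
// listing with "/" as the delimiter; subfolders come back as prefixes.
await foreach (BlobHierarchyItem item in
    container.GetBlobsByHierarchyAsync(prefix: "images/", delimiter: "/"))
{
    if (item.IsBlob)
        Console.WriteLine(item.Blob.Name); // files directly under images/
}
```

The key point is that "folders" here are nothing but naming conventions: both listings simply filter blob names, so there is no folder to create or delete as a separate operation.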
I have an ASP.NET Core API project which lets users upload images. My first implementation was to use Base64 and save the images in SQL Server. However, I decided not to do that because of performance issues. The second implementation was to use Azure Blob Storage and upload the files directly into blob storage.
I am not sure if this is a good idea, but instead of using Azure Blob Storage I would like to upload the images somewhere on my Linux server. Is there any special directory for saving files, and would it be safe for me to do that?
As far as I know, there is no special directory on a Linux server that would be safer for storing files.
All folders on Linux behave the same way: if your process has sufficient permissions, it can read and write images there.
Normally you would add a folder inside your application to store the uploaded images, so that you can use relative paths in your code.
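For example, a minimal ASP.NET Core controller that saves an uploaded image into an "uploads" folder inside the app might look like this sketch (the folder name and route are assumptions):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/images")]
public class ImagesController : ControllerBase
{
    private readonly IWebHostEnvironment _env;
    public ImagesController(IWebHostEnvironment env) => _env = env;

    [HttpPost]
    public async Task<IActionResult> Upload(IFormFile file)
    {
        // A folder relative to the app's own content root, as described above.
        var uploadsDir = Path.Combine(_env.ContentRootPath, "uploads");
        Directory.CreateDirectory(uploadsDir); // no-op if it already exists

        // Never trust the client-supplied file name; generate our own.
        var safeName = $"{Guid.NewGuid()}{Path.GetExtension(file.FileName)}";
        var fullPath = Path.Combine(uploadsDir, safeName);

        await using var stream = System.IO.File.Create(fullPath);
        await file.CopyToAsync(stream);

        return Ok(new { name = safeName });
    }
}
```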
In my opinion, using blob storage is a good idea. Blobs can be accessed directly by URL if the caller has permission, and you can generate a SAS to allow only specific users access. It also offers high durability and is cheap enough.
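A sketch of generating such a SAS with the Azure.Storage.Blobs SDK (the account, container and blob names are placeholders):

```csharp
using System;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

// Placeholders; in a real app these come from configuration.
var accountName = "myaccount";
var accountKey  = "<account-key>";
var credential  = new StorageSharedKeyCredential(accountName, accountKey);

var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = "images",
    BlobName = "users/42/photo.png",
    Resource = "b",                                   // "b" = a single blob
    ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(15)  // short-lived link
};
sasBuilder.SetPermissions(BlobSasPermissions.Read);   // read-only access

var sasUri = new BlobUriBuilder(new Uri(
    $"https://{accountName}.blob.core.windows.net/images/users/42/photo.png"))
{
    Sas = sasBuilder.ToSasQueryParameters(credential)
}.ToUri();
// Hand sasUri to the specific user; it stops working after 15 minutes.
```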
I'm running containerized Alfresco in Docker (the pom shows alfresco-core.version 7.21). According to the official documentation, the files should be stored as .bin files in alf_data/contentstore, but when I go into the Alfresco container, alf_data is an empty directory. Even when I search the whole container for .bin files, I find nothing related to my files.
Can anyone tell me how I can find my files?
Thanks!
Look in your Docker Compose file and see if an external volume has been defined. It is likely, as any content stored directly in the container would be ephemeral; using a volume allows content to be written to the host file system.
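For example, a typical excerpt from an Alfresco docker-compose.yml might look like this (the service, image and volume names are illustrative; yours may differ):

```yaml
services:
  alfresco:
    image: alfresco/alfresco-content-repository-community
    volumes:
      # The contentstore lands in a named volume on the host,
      # not in the container's ephemeral writable layer.
      - alf-repo-data:/usr/local/tomcat/alf_data

volumes:
  alf-repo-data:
```

If such a volume exists, `docker volume inspect alf-repo-data` will show where it lives on the host.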
Just in case you were tempted, though, you shouldn't be doing anything with those files directly. The Alfresco content store uses a hashed directory structure and renames all files using a GUID and an extension of "bin".
You should also check your repository.properties/alfresco-global.properties files and look for the configured location. Note: only the files themselves (as in Word, PDF, etc.) are stored on disk; the metadata goes into the database.
https://hub.alfresco.com/t5/alfresco-content-services-forum/changing-the-location-of-contentstore-dir/td-p/215540
I'm creating an app where users need to work with large databases. Rather than having the users download the data and populate an SQLite database on the client (which would take a long time), I'm hoping I can use downloadable, pre-populated databases.
I found cordova-sqlite-ext, which allows working with pre-populated databases, but SQLite files must be located in the www folder for this to work. Is it actually possible to download files to this folder in Ionic/Cordova (on non-rooted devices)?
It's always good practice to store your files in the app's directory. Check this comment of mine and see if it helps you:
https://github.com/driftyco/ionic-native/issues/881#issuecomment-270832521
I had a requirement to download a zip file (with an SQLite file in it), unzip it, store the file in the app's directory, and query the DB. I was able to achieve this using plugins, and it works quite well.
I'm building an ASP.NET site where users can upload files into their user account and then download them whenever they are logged in. The files will typically be less than 5 MB, and users can only download the files that they have uploaded (i.e. they can't download someone else's files). There are around 100k users and each could potentially upload around 2 or 3 files. The live site is load balanced.
I'm thinking that storing these files in the central DB (SQL Server) as BLOBs would be nice because...
As the site is load balanced, each node can access the files from the central DB. There is no need to worry about having a shared folder to store the files.
I can more easily ensure that users only download their own files.
Backing up the DB automatically includes the file BLOBs.
The only downside to this I've read about is performance, but how bad can it be?
If I were to store these files in the file system, would there be any problem storing it all in one folder?
What is the best approach for this?
If you want it fast, then store the data in the file system; that way you don't have to pull the file contents through SQL Server. You could take one of two approaches:
1st approach: Store all files in one single folder (NTFS limit: 4,294,967,295 files per volume, FAT32 limit: 268,435,437).
2nd approach: Create a separate subfolder for each user (user id). I would prefer this over the 1st approach, since very large directories become slow to enumerate and a per-user folder makes the "only download your own files" check trivial; see the sketch below.
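A minimal sketch of the 2nd approach in C# (the root path is an assumed configuration value):

```csharp
using System;
using System.IO;

public static class UserFileStore
{
    // Assumed location; on a load-balanced site this would be a share
    // reachable from every node, configured per environment.
    private const string Root = @"D:\AppData\UserFiles";

    public static string Save(int userId, string fileName, Stream content)
    {
        // One subfolder per user keeps individual directories small.
        var userDir = Path.Combine(Root, userId.ToString());
        Directory.CreateDirectory(userDir); // no-op if it already exists

        // Strip any path the client sent; keep just the file name.
        var safeName = Path.GetFileName(fileName);
        var fullPath = Path.Combine(userDir, safeName);

        using var target = File.Create(fullPath);
        content.CopyTo(target);
        return fullPath;
    }

    public static Stream Open(int userId, string fileName)
    {
        // Building the path from the caller's own userId is what enforces
        // "users can only download their own files".
        var path = Path.Combine(Root, userId.ToString(), Path.GetFileName(fileName));
        return File.OpenRead(path);
    }
}
```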
With newer versions of SQL Server you can also use FILESTREAM, which stores the BLOB data on the NTFS file system while keeping it transactionally consistent with the database.
It would also be interesting to know:
how many users you have
how many uploads per day/hour/minute you could have
how many downloads per day/hour/minute you could have
how many simultaneous uploads you could have
how many simultaneous downloads you could have
what size your systems are
whether it is used internally or externally, and what bandwidth/network load you have
I'm developing an application using the Adobe Flex 4.5 SDK, in which the user would be able to export multiple files bundled in one zip file. I was thinking I would need to take the following steps to perform this task:
Create a temporary folder on the server for the user who requested the download. Since it is an anonymous type of user, I have to read State/Session information to identify the user.
Copy all the requested files into the temporary folder on the server
Zip the copied files
Download the zip file from the server to the client machine
I was wondering if anybody knows of any best practices or sample code for this task.
Thanks
The ByteArray class has some methods for compressing, but this is more for data transport, not for packaging up multiple files.
I don't like saying things are impossible, but I will say that this should be done on the server side. Depending on your server architecture, I would suggest sending the binary files to a server script which could package the files for you.
A quick Google search for your preferred server-side language and zipping files should give you some sample scripts to get you started.
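For example, if your server side happens to be C#/ASP.NET, the packaging steps might look like this sketch (the session id and file list are placeholders):

```csharp
using System.IO;
using System.IO.Compression; // ZipFile lives here

// Placeholders; in practice these come from the session and the request.
string sessionId = "abc123";
string[] requestedFiles = { @"C:\data\a.png", @"C:\data\b.pdf" };

// Steps 1–2: a temporary folder per user/session, filled with the files.
string tempDir = Path.Combine(Path.GetTempPath(), sessionId);
Directory.CreateDirectory(tempDir);
foreach (string file in requestedFiles)
    File.Copy(file, Path.Combine(tempDir, Path.GetFileName(file)), overwrite: true);

// Step 3: zip the folder.
string zipPath = Path.Combine(Path.GetTempPath(), sessionId + ".zip");
File.Delete(zipPath); // CreateFromDirectory fails if the target exists
ZipFile.CreateFromDirectory(tempDir, zipPath);

// Step 4: stream zipPath back to the Flex client, then clean up
// tempDir and zipPath.
```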