I've got an archive located on my web site with files in it that I'd like users of my app to be able to access. The files at this location may change, so I'd like to be able to download a file list that is then presented to the users, who would then click on the file(s) to download.
I think I've got the downloading handled with NSURLSession, but I can't find a direct way to get a list of the files located at http://www.example.com/archive/ in Swift. I feel like I'm missing something obvious. By the way, I'm not well versed in most aspects of "web programming" so small words would be appreciated if this involves stuff like POST and GET. Thanks.
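By default, a plain HTTP server doesn't give clients a directory listing, which is why there's no direct way to ask for one. One common workaround is to host a small manifest next to the archive and fetch that first. A minimal sketch, assuming a hypothetical files.json at the archive URL containing a JSON array of file names (no POST/GET knowledge needed; URLSession performs a plain GET for you):

import Foundation

// Hypothetical manifest: http://www.example.com/archive/files.json
// containing e.g. ["report.pdf", "photo.jpg"] -- a file you would
// create and keep up to date on the server yourself.
let archiveURL = URL(string: "http://www.example.com/archive/")!
let manifestURL = archiveURL.appendingPathComponent("files.json")

let task = URLSession.shared.dataTask(with: manifestURL) { data, _, error in
    guard let data = data, error == nil else {
        print("Could not fetch the file list: \(error?.localizedDescription ?? "unknown error")")
        return
    }
    if let names = try? JSONDecoder().decode([String].self, from: data) {
        // Present these to the user; appending a name to archiveURL
        // gives the download URL for that file.
        print(names)
    }
}
task.resume()

Another option is enabling the web server's own directory index and parsing the returned HTML, but a manifest you control yourself is simpler and more robust.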
I'm exporting an Excel file that's created dynamically at run time from DataTables in an aspx.cs file on the server side, using the ClosedXML library. I want to let the user select the download location on the client side; currently the file always goes to Downloads.
Unfortunately, you cannot do this. This is also why you can never select a local file or location from the server side.
So, the user's local file system is 100% off limits.
And the reason is quite simple. If you come to my web site to look at some cute cat picture, do you think it would be OK if my web code also started looking around at your files? Hum, maybe a file called banking? Maybe a file called passwords? How about I steal all your email files? How about I look for an Excel sheet called passwords?
So, when it comes to poking around, looking at, and deciding things like file locations? You cannot GET ANY information from the server side, nor can you even find and define what file to pick for uploading, and the SAME applies to downloading of files. If I could pick a location, then gee, why don't I start overwriting some of your system files - including some that would give me remote access to your computer, right?
So, things like what folder, what file, even the computer name? These things are 100% hidden, off limits, and simply not allowed. Now, it would be possible for someone to come out with a new web browser that allowed local file rights and access. But then again, no one in their right mind would ever use such a browser; the security hole would be too large. As a result, for reasons of security, such information, and even simple knowledge of the local file system, is not allowed, nor even exposed to the web server.
But then again, the user might be on an iPad or an Android phone, and their file systems and folder layouts don't even work the same as, say, a Windows desktop computer anyway.
So you can see from the above that messing with, or even choosing, local file locations is not allowed for reasons of security.
So, if your web site provides a file, or even streams down a file, it will go into the download folder per the user's browser settings. You unfortunately can't change this; it works that way due to security concerns.
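The most you can do from the server side is suggest a file name (never a location) with the Content-Disposition header. A rough Web Forms sketch using ClosedXML - myDataTable here is a placeholder for your DataTable:

// Requires System.IO, System.Web, and ClosedXML.Excel.
// The server only *suggests* the name; where the file lands is still
// decided by the user's browser settings (usually the Downloads folder).
using (var workbook = new ClosedXML.Excel.XLWorkbook())
using (var stream = new System.IO.MemoryStream())
{
    workbook.Worksheets.Add(myDataTable, "Report");  // myDataTable: your DataTable
    workbook.SaveAs(stream);

    Response.Clear();
    Response.ContentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
    Response.AddHeader("Content-Disposition", "attachment; filename=Report.xlsx");
    Response.BinaryWrite(stream.ToArray());
    Response.End();
}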
I am looking for any reference material to read up on, relating to what enables the following scenario, where a web site has a unique identifier appended to its domain name.
When you go to Facebook and view your profile, the URL in the address bar is something like:
https://www.facebook.com/your_user_name.number
There is no obvious file extension, nor is the 'your_user_name.number' being passed as a query string value. I do know that I could create a folder in the web directory with this name, and then you can browse to that folder and it will auto-load the default or index file based on your web server settings. But I am not sure that is what's happening in this case, as Facebook would then have to create 2 billion+ folders.
Browsing to your Photos on Facebook, the URL then looks like:
https://www.facebook.com/your_user_name.number/photos
I am keen to understand what this type of technical configuration is called. Happy to read up on it myself and learn about it, but I don't even know what it's called to search and read up on.
Any pointers?
What you are looking for is URL rewrite.
https://learn.microsoft.com/en-us/iis/extensions/url-rewrite-module/creating-rewrite-rules-for-the-url-rewrite-module
You can basically map whatever URL shape you want - no need for extensions or folders at all if you don't want them.
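As a rough illustration of what the linked IIS module does, a single rewrite rule in web.config can map every /your_user_name.number URL onto one handler page (profile.aspx is a made-up name here):

<!-- web.config sketch: one rule serves every profile page; no per-user
     folders or file extensions exist anywhere on disk. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="ProfilePage">
        <match url="^([^/]+)$" />
        <action type="Rewrite" url="profile.aspx?user={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>

A second rule with a pattern like ^([^/]+)/photos$ would cover the /photos variant the same way.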
I'm running RStudio Server on an instance of Google Compute Engine. My R script creates a map file that I would like to include in a public web site.
The file gets created OK.
Separately, I've also created a bucket and can upload images to it, viewing them from a web browser with a URL like this: https://storage.googleapis.com/...
Still, I'm confused as to how to make the image created by the R script viewable by a browser. Does the image have to find its way over to a bucket? Or is it viewable where it is somehow?
There are many possible solutions depending on what you want to implement and how much time you want to spend on it (and whether you are the only one accessing the files, and whether they can be shared or are sensitive), so I will provide some hints:
The easiest one is to upload the file to a Google Cloud Storage bucket. You can then control who can access that link (a single user, a domain, or everyone), and the file can be accessed from the browser with a link like the following:
https://storage.googleapis.com/namebucket/folder1/folder2/file_name
There is no graphical interface; you will need to know the address to download the file (in the end it is enough to know the name). You will need to create a small script so that every time an image is available, it is uploaded to the bucket and made publicly available - see the sketch below. Or you can decide to make the bucket itself public.
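You are working in R, but as a sketch of what that script has to do, here is the idea with the google-cloud-storage Python client (the bucket and object names are the placeholders from above, and the local path is made up):

from google.cloud import storage  # pip install google-cloud-storage

# Upload a freshly generated image and make it publicly readable.
client = storage.Client()
bucket = client.bucket("namebucket")
blob = bucket.blob("folder1/folder2/file_name.png")
blob.upload_from_filename("/home/rstudio/output/map.png")  # made-up local path
blob.make_public()
print(blob.public_url)  # https://storage.googleapis.com/namebucket/folder1/folder2/file_name.png

The same two steps can also be done from the VM's shell with the gsutil command-line tool.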
The second possible solution is to do the same, but also create a REALLY simple HTML page - basically a list of links to the files in the bucket - and each time you upload a file to the bucket, you update the HTML file. At least you would solve the issue of knowing the names, and you can navigate it a bit:
<html><body>
<a href="https://storage.googleapis.com/namebucket/folder1/folder2/file_name">This is a link</a>
</body></html>
If you need to expose the resources to more people, or you would like something more graphically "nice", you will have to spend more time and build a decent frontend. You can follow thousands of different approaches; you really have thousands of possibilities.
P.S.
Documentation regarding uploading a file to a bucket.
Documentation regarding managing access to stored files.
Notice that with this approach the browser behaves differently depending on the extension of the file you want to share: a .txt or a .jpg is shown, while an .exe is downloaded.
We have an ASP.NET website which allows users to upload files. Currently we do this with a FileStream: the user clicks a button and we create the folder under site root > downloads > folder-idofuser > filename. Is there a better way than this, i.e. all files in a specific location outside the application? Paths are not stored in the database; for access we just list the files in the relative folder. Maybe we should do it differently - any advice?
Users can upload as many files as they want, and presently we get about 30,000 per month. Due to our folder structure we are now experiencing slowness on the folder, so we archived folders over 6 months old to a new location on the HD. Lots of files - it was OK at the start, but now, with over 600 GB worth, it's a bit of a nightmare, and it still leaves hundreds of thousands of folders under one folder.
I have read lots and lots of articles and questions and I'm still not sure what's the best thing to do; see two examples:
Looked at https://stackoverflow.com/questions/810215/best-practice-checks-when-allowing-users-to-upload-files-to-a-web-application
and Managing user-uploaded files with an ASP.NET website and Visual Studio
Now we are planning on moving from our current in-house hosting to a server farm. This seems very expensive when you have as many files as we do. Should we have two servers, one for applications/websites and one for files and the database, or would one suffice? Obviously money is a big part of the equation, but we're trying to get a good idea of what's best for the pound. And how should we store the files and access them? Should we use subdomains, etc., and how?
Hope the question's not too boring; thanks all in advance.
Isn't it better to save the files on the server with unique names (GUIDs) and save the path to the database?
This way you only need to query your database, which is really fast, rather than looping through all the files on your file system.
And when someone requests a file, you just give it to the user. This should be really fast, I think. This way you can also have a script that runs every day or month to archive your files, setting a flag in the record that the file has been archived, along with its new path. This can be done automatically every night.
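A minimal sketch of that idea in ASP.NET - the uploadRoot path, the SQL schema, and the variable names are all made up:

// Namespaces assumed: System, System.IO, System.Data.SqlClient, System.Web.
// Store the upload under a GUID name outside the web root and keep the
// mapping (user, original name, stored path) in the database.
string uploadRoot = @"D:\UserFiles";  // a location outside the application
string storedName = Guid.NewGuid().ToString("N") + Path.GetExtension(postedFile.FileName);
string storedPath = Path.Combine(uploadRoot, storedName);

postedFile.SaveAs(storedPath);  // postedFile: the HttpPostedFile from the request

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "INSERT INTO UserFiles (UserId, OriginalName, StoredPath, Archived) " +
    "VALUES (@uid, @name, @path, 0)", conn))
{
    cmd.Parameters.AddWithValue("@uid", userId);
    cmd.Parameters.AddWithValue("@name", Path.GetFileName(postedFile.FileName));
    cmd.Parameters.AddWithValue("@path", storedPath);
    conn.Open();
    cmd.ExecuteNonQuery();
}

Serving a download is then a single indexed lookup on UserFiles rather than a directory scan, and the nightly archive job only has to update StoredPath and the Archived flag.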
I've been messing around a bit with various solutions to what I would see as a fairly common problem, but I've not yet been able to solve it in a satisfactory way.
What I wish to achieve is some kind of functionality where a user can upload new files, or select existing files to reuse them.
What I've been using so far is a combination of the filefield, filefield_sources, imce and ckeditor modules. I guess ckeditor isn't really important for the solution, but I need to be able to embed images from the archive somehow, and this is done with IMCE. Since I do not want everything to be accessible from the file browser, I created a subdirectory and set full access to it in the IMCE settings; let's call it default/files/site.
This worked fine as long as all file handling was done through IMCE, but when I uploaded files directly from the filefield, my files ended up in the default/files root, so I set up folders for my fields, for example default/files/site/movies in a field that allowed the .flv format. This worked fine too, as long as I didn't try to access the files through IMCE. It appears the folders created by filefield are not accessible from IMCE?
I'm also in a position where I need to support large uploads (200 MB+). From my experience in other projects, allowing file uploads through FTP is usually a life-saver, but from what I understand IMCE won't support files not uploaded through Drupal in some way, since they are not present in the database (giving the message: 'The selected file could not be used because the file does not exist in the database.').
I'm aware that I don't really have a clear question to my problem, but somehow I need to figure this out pretty fast. How would I preferably solve this? I'm aware that I'm not the first to have this problem, but I have not yet been able to find a nice and stable solution. What am I missing?
Also check this thread (http://drupal.org/node/438940) and the reference to John Locke's work at: http://www.freelock.com/blog/john-locke/2010-02/using-file-field-imported-files-drupal-drush-rescue
Well, I'm not personally familiar with IMCE off the top of my head, but if you need files that have been uploaded via FTP to be added to the files table, then my impulse would be to write a small module which would allow the user to click a button and start off a batch process. (This is me assuming that you are using Drupal 6, as the batch API doesn't exist in 5.)
Said batch process would then iterate over all of the files in the appropriate directory, which I would assume you had uploaded the files to, use file_copy() (from Drupal's file API) to copy the files to default/files/site, and then would add said files to the files table, which is actually quite simple with drupal_write_record().
It might not even need to use the batch API - it somewhat matters whether you're uploading just 10-30 really big files, or 200-300 MB worth of files.
For using the batch API, I'd look at http://drupal.org/node/180528 - this has a fairly basic example of how the batch API works, which basically consists of telling the API that you want to keep calling function_a, and then inside of function_a setting your progress in the context array until you're done, at which time the batch process finishes. Then you just have whoever uploads the files via FTP hit a button on the website to move and register the files.
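As a rough sketch of the non-batch version (Drupal 6 file API; the FTP directory and module name are made up):

<?php
// Sketch: register FTP-uploaded files so Drupal/IMCE can see them.
// file_copy() moves each file into default/files/site and, because it
// takes $path by reference, rewrites $path to the new location.
function mymodule_import_ftp_files() {
  $dest = file_directory_path() . '/site';
  foreach (file_scan_directory('/var/ftp/incoming', '.*') as $found) {
    $path = $found->filename;  // full path of the scanned file
    if (file_copy($path, $dest, FILE_EXISTS_RENAME)) {
      $file = new stdClass();
      $file->uid       = 1;
      $file->filename  = basename($path);
      $file->filepath  = $path;
      $file->filemime  = file_get_mimetype($path);
      $file->filesize  = filesize($path);
      $file->status    = FILE_STATUS_PERMANENT;
      $file->timestamp = time();
      drupal_write_record('files', $file);  // insert into the {files} table
    }
  }
}

Each row registered this way is what makes IMCE stop complaining that 'the file does not exist in the database'.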