In an Office 365 SharePoint document library, can you automate folder approval but NOT file approval?

So I need to automate approval for all folders created, but NOT the files created, in the document library. However, I see no options for folders, only for files. Is this possible, or am I wasting my time and need to move on? I have literally gone through all of Power Automate and the SharePoint site settings and can't find anything. I can't use any coding outside of VBA due to company restrictions.

You can use the 'Get files (properties only)' SharePoint action in your flow, then add a condition on the 'Is Folder' property; when it is true, apply your desired approval.

Related

Make files public from Google Compute Engine

I'm running RStudio Server on an instance of Google Compute Engine. My R script creates a map file that I would like to include in a public web site.
The file gets created OK.
Separately, I've also created a bucket and can upload images to it, viewing them from a web browser with a URL like this: https://storage.googleapis.com/...
Still, I'm confused as to how to make the image created by the R script viewable by a browser. Does the image have to find its way over to a bucket? Or is it viewable where it is somehow?
There are countless possible solutions depending on what you want to implement and how much time you want to spend on it (and whether you are the only one accessing the files, and whether they can be shared or are sensitive), so I will give you some hints:
The easiest one is to upload the file to a Google Cloud Storage bucket; you can then control who can access it (a single user, a domain or everyone), and it can be reached from a browser with a link like the following:
https://storage.googleapis.com/namebucket/folder1/folder2/nome_file
There is no graphical interface; you need to know the address to download the file (in the end it is enough to know the name). You will need a small script that, every time an image is generated, uploads it to the bucket and makes it publicly available, or you can decide to make the bucket itself public.
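If you do end up scripting that step, the client libraries keep it to a few lines. Below is a minimal sketch using the .NET client (Google.Cloud.Storage.V1); the bucket and folder names reuse the placeholders from the link above, map.png is a made-up file name, and the same two steps (upload, then make the object public) can equally be done from the GCE instance with gsutil or an R client package.

using System.IO;
using Google.Cloud.Storage.V1;   // NuGet package: Google.Cloud.Storage.V1

class UploadMap
{
    static void Main()
    {
        // On a GCE instance this picks up the default service account credentials.
        var client = StorageClient.Create();

        using (var stream = File.OpenRead("map.png"))
        {
            // PredefinedAcl = PublicRead makes this one object world-readable, so it can be
            // fetched at https://storage.googleapis.com/namebucket/folder1/folder2/map.png
            client.UploadObject(
                "namebucket", "folder1/folder2/map.png", "image/png", stream,
                new UploadObjectOptions { PredefinedAcl = PredefinedObjectAcl.PublicRead });
        }
    }
}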
The second possible solution is to do the same but also create a really simple HTML page, basically a list of links to the files in the bucket; each time you upload a file to the bucket you update the HTML file. At least that would solve the issue of knowing the names, and you can navigate it a bit.
<html><body>
<a href="https://storage.googleapis.com/namebucket/folder1/folder2/nome_file">This is a link</a>
</body></html>
If you need to expose the resources to more people, or you would like something nicer graphically, you will have to spend more time and build a decent frontend; there are really thousands of different approaches you could follow.
P.S.
Documentation regarding uploading a file to a bucket.
Documentation regarding managing access to stored files.
Notice that, depending on the extension of the file you want to share, the browser behaves differently: a .txt or a .jpg is shown, while an .exe is downloaded.

How to handle uploaded files on server farm

We have an ASP.NET website which allows users to upload files. Currently we do this via a FileStream: the user clicks a button and we create the folder under site root > downloads > folder-idofuser > filename. Is there a better way than this, e.g. keeping all files in a specific location outside the application? Paths are not stored in the database; for access we just list the files in the relative folder on the file system. Maybe we should do it differently, any advice?
Users can upload as many files as they want, and presently we get about 30,000 per month. Due to our folder structure we are now experiencing slowness, so we archived folders over 6 months old to a new location on the hard drive. Lots of files, and it was OK at the start, but now with over 600 GB worth it's a bit of a nightmare, and it still leaves hundreds of thousands of folders under one folder.
I have read lots and lots of articles and questions and I'm still not sure what the best thing to do is; see these two examples:
Looked at https://stackoverflow.com/questions/810215/best-practice-checks-when-allowing-users-to-upload-files-to-a-web-application
and Managing user-uploaded files with an ASP.NET website and Visual Studio
Now we are planning on moving from our current in-house hosting to a server farm. This seems very expensive with as many files as we have. Should we have two servers, one for applications/websites and one for files and the database, or would one suffice? Obviously money is a big part of the equation, but I'm trying to get a good idea of what's the best value for the pound. If you have any good ideas, please advise. And how should we store the files and access them? Should we use subdomains, and if so, how?
Hope the question's not too boring; thanks all in advance.
Isn't it better to save the files on the server with unique names (GUIDs) and save the path in the database?
That way you only need to query your database, which is really fast, instead of looping through all the files on your file system.
When someone requests a file, just serve it from the stored path, which should be really fast. You can also have a script that archives your files every day or month, setting a flag on the record along with the new path to the file; this can run automatically every night.
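For illustration, a minimal sketch of that pattern in C# (Web Forms era, System.Data.SqlClient); the Uploads table, its columns, the D:\FileStore path and the helper name are all invented for the example, not taken from the question:

using System;
using System.Data.SqlClient;
using System.IO;
using System.Web;

public static class UploadStore
{
    // Saves the posted file under a GUID name outside the web root and
    // records the original name and stored path in the database.
    public static string Save(HttpPostedFile file, int userId, string connectionString)
    {
        string storageRoot = @"D:\FileStore";          // any location outside the site root
        Directory.CreateDirectory(storageRoot);        // no-op if it already exists

        string storedName = Guid.NewGuid().ToString("N") + Path.GetExtension(file.FileName);
        string fullPath = Path.Combine(storageRoot, storedName);
        file.SaveAs(fullPath);

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO Uploads (UserId, OriginalName, StoredPath, UploadedOn) " +
            "VALUES (@userId, @originalName, @storedPath, GETDATE())", conn))
        {
            cmd.Parameters.AddWithValue("@userId", userId);
            cmd.Parameters.AddWithValue("@originalName", Path.GetFileName(file.FileName));
            cmd.Parameters.AddWithValue("@storedPath", fullPath);
            conn.Open();
            cmd.ExecuteNonQuery();
        }

        return fullPath;
    }
}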

Handling file uploads in Drupal

I've been messing around a bit with various solutions to what I would see as a fairly common problem, but I've not yet been able to solve it in a satisfactory way.
What I wish to achieve is some kind of functionality where a user can upload new files, or select existing files to reuse them.
What I've been using so far is a combination of the filefield, filefield_sources, imce and ckeditor modules. I guess ckeditor isn't really important for the solution, but I need to be able to embed images from the archive somehow, and this is done with IMCE. Since I do not want everything to be accessible from the file browser, I created a subdirectory and set full access to it in the IMCE settings; let's call it default/files/site.
This worked fine as long as all file handling was done through IMCE, but when I uploaded files directly from the filefield, my files ended up in the default/files root. So I set up folders for my fields, for example default/files/site/movies for a field that allowed the .flv format. This worked fine too, as long as I didn't try to access the files through IMCE. It appears the folders created by filefield are not accessible from IMCE?
I'm also in a position where I need to support large uploads (200 MB+). In my experience from other projects, allowing file uploads through FTP is usually a life-saver, but from what I understand IMCE won't support files that weren't uploaded through Drupal in some way, since they are not present in the database (giving the message: The selected file could not be used because the file does not exist in the database.)
I'm aware that I don't really have a clear question to my problem, but somehow I need to figure this out pretty fast. How would I preferably solve this? I'm aware that I'm not the first to have this problem, but I have not yet been able to find a nice and stable solution. What am I missing?
Also check this thread (http://drupal.org/node/438940) and the reference to John Locke's work at: http://www.freelock.com/blog/john-locke/2010-02/using-file-field-imported-files-drupal-drush-rescue
Well, I'm not personally familiar with IMCE off the top of my head, but if you need files that have been uploaded via FTP to be added to the files table, then my impulse would be to write a small module which would allow the user to click a button and start off a batch process. (This is me assuming that you are using Drupal 6, as the Batch API doesn't exist in 5.)
Said batch process would then iterate over all of the files in the appropriate directory, which I would assume you had uploaded the files to, use file_copy() (from Drupal's file API) to copy the files to default/files/site, and then would add said files to the files table, which is actually quite simple with drupal_write_record().
It might not even need to use the Batch API; it somewhat depends on whether you're just uploading 10-30 really big files, or 200-300 MB files.
For using the Batch API, I'd look at http://drupal.org/node/180528 - this has a fairly basic example of how the Batch API works, which basically consists of telling the API that you want to keep calling function_a, and then inside of function_a setting your progress in the context array until you're done, at which time the batch process finishes. Then you just have whoever uploads the files via FTP hit a button on the website to move and register the files.

ASP.NET file uploading - dynamic file names

I have a web page with an ASP.NET file upload control to upload files from the client machine to the server. Now I want to do the uploading n number of times. For example: I want to upload 100 files from my local PC to the server. I can read the 100 file names from an Excel file in my program, but is there any way to assign these files to the file upload control?
No; as a security feature, FileUpload controls do not allow you to set what to upload (imagine if you signed on to a website and it was set to upload a passwords file or something).
Now there is probably another control, or a way to code around this, but the FileUpload control will not allow it.
I would recommend using the jQuery Multifile Uploader, which takes care of the UI (if you need one), and handling the actual uploads with Free ASP Upload, which takes care of the file transfer. Though it sounds like you are doing this programmatically, so you can skip the multi-file UI and just work with Free ASP Upload.
You'll have to make your own Flash object or something to accomplish this; the basic HTML/ASP.NET controls won't let you do what you're looking for.
This will require creating some kind of ActiveX or otherwise installable control. To get around the security restriction involved, you're ultimately going to have to be able to execute code on the machine to select and upload the file.
And at that point, you're platform specific, so...
I would strongly suggest that, instead of trying to have a web site automatically upload files for you, you make a WinForms utility to accomplish this task and upload the files wherever you need, communicating with the web site over web services, etc.
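If you go the desktop-utility route, the upload loop itself is small. Here is a minimal sketch in C# (console app, HttpClient), assuming a hypothetical upload endpoint on the site and a plain-text manifest of file paths exported from the Excel sheet:

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class BulkUploader
{
    static async Task Main()
    {
        const string uploadUrl = "https://example.com/upload";   // hypothetical endpoint on your site
        var client = new HttpClient();

        // files.txt holds one full path per line, e.g. exported from the Excel sheet.
        foreach (var path in File.ReadAllLines("files.txt"))
        {
            using (var form = new MultipartFormDataContent())
            {
                // Posts each file as a standard multipart/form-data field named "file".
                form.Add(new ByteArrayContent(File.ReadAllBytes(path)), "file", Path.GetFileName(path));

                var response = await client.PostAsync(uploadUrl, form);
                Console.WriteLine("{0}: {1}", path, (int)response.StatusCode);
            }
        }
    }
}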
This is a security restriction: you can't script the file selection of an upload box, as that would allow hackers to write scripts to steal files off your computer.
You could use this Silverlight upload utility, which is on my list of "things to use when I get the chance".
It has a nice UI and supports uploading many files at once. I originally tracked it down doing some research for a photography website that we were quoting for, but that project fell through.
Anyway the project can be found here:
http://www.michielpost.nl/Silverlight/MultiFileUploader/
It also has full source code included so even if the control's developers abandon it you still have the choice to edit it yourself.

Where should I put my log file for an asp.net application?

I have an ASP.NET application that we've written our own logging module for.
My question is: where is the standard place to write a log file to? That is, the website will be running as the anonymous user identity (e.g. IUSR on IIS 7), and I need a place where I know it will have permission to write.
Cheers,
The App_Data folder in the root of the project. It isn't served to web requests, so other people can't snoop for it.
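As a minimal sketch of that suggestion (the log file name is just an example):

using System;
using System.IO;
using System.Web;

public static class AppDataLogger
{
    // Appends one timestamped line to ~/App_Data/app.log; IIS never serves App_Data contents.
    public static void Write(string message)
    {
        string folder = HttpContext.Current.Server.MapPath("~/App_Data");
        Directory.CreateDirectory(folder);   // no-op if the folder already exists
        File.AppendAllText(Path.Combine(folder, "app.log"),
                           DateTime.UtcNow.ToString("o") + " " + message + Environment.NewLine);
    }
}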
I would suggest putting the log file onto a separate disk, which should give you a little performance gain, since you're not trying to both read and write to the same disk as the website. If you cannot put the log file on a separate disk, then simply choose a folder of your choice.
In any case, you will have to give the "Network Service" account "Modify" permissions to the desired folder.
If, on the other hand, you have access to a database, then log the information there. It will be much quicker than accessing the hard drive and won't be publicly available. You'll also be able to report on the data quite easily.
I'm not in a position to modify the permissions on folders (especially outside of the virtual directory home folder), and don't already have an App_Data folder, so am a bit hesitant to go with that.
So for the moment I'm going with the CommonApplicationData folder.
On Vista/Server 2008 this is C:\ProgramData\
On XP/Server 2003 this is C:\Documents and Settings\All Users\Application Data\
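A small sketch of resolving that folder without hard-coding the OS-specific path (the application subfolder name is made up):

using System;
using System.IO;

public static class CommonDataPaths
{
    // Resolves to C:\ProgramData on Vista/Server 2008 and to
    // C:\Documents and Settings\All Users\Application Data on XP/Server 2003.
    public static string GetLogFolder()
    {
        string root = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData);
        string folder = Path.Combine(root, @"MyWebApp\Logs");   // hypothetical subfolder
        Directory.CreateDirectory(folder);                      // the app identity needs write access here
        return folder;
    }
}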
Regarding "I'm not in a position to modify the permissions on folders (especially outside of the virtual directory home folder), and don't already have an App_Data folder, so am a bit hesitant to go with that":
If you have a website, you clearly have a folder somewhere. Can you not add a (non-web-facing) subfolder? It seems like that would be a more appropriate place to put your logs than dumping them into a global, shared folder.
You could also log to the Windows Event Log or to a table in a database. How often are people looking at the log? If it's being examined on a regular basis, writing to a table makes the reporting back much easier, as it's trivial to reverse the order and only show the last X events for the current time period. You can also query the Windows Event Log through PowerShell or with LogParser.
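If you do go the Event Log route, the write itself is one call; the catch is that the event source has to be registered once with admin rights. A hedged sketch ("MyWebApp" is a made-up source name):

using System.Diagnostics;

public static class EventLogging
{
    public static void Write(string message)
    {
        const string source = "MyWebApp";   // hypothetical event source name

        // Creating (and even checking) a source needs elevated rights, so do this
        // once at install time rather than from the anonymous IIS identity.
        if (!EventLog.SourceExists(source))
            EventLog.CreateEventSource(source, "Application");

        EventLog.WriteEntry(source, message, EventLogEntryType.Information);
    }
}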
Using App_Data is the best idea; just bear in mind that when publishing the project, if the option "Delete all existing files before publishing" is ticked, the current data in the folder will be gone. The workaround is to skip deletion of the App_Data folder.
Another option for logging is to use an existing framework such as log4net.
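For completeness, basic log4net usage looks roughly like the sketch below; where the output actually goes (file, event log, database) is decided by the appender you configure in web.config or a separate log4net.config, not in code.

using System;
using log4net;
using log4net.Config;

public class Global : System.Web.HttpApplication
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(Global));

    protected void Application_Start(object sender, EventArgs e)
    {
        // Reads the appender and level settings from the XML configuration.
        XmlConfigurator.Configure();
        Log.Info("Application started");
    }

    protected void Application_Error(object sender, EventArgs e)
    {
        // Logs any unhandled exception with its stack trace.
        Log.Error("Unhandled exception", Server.GetLastError());
    }
}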
