I can see that it is possible to upload files to be shared in the network, but if the subjects covered by the group are diverse, it might be interesting to put the files in distinct directories. (I am aware of the tagging option.)
Yammer supports file uploads, but its primary focus is not as a file repository. If you are already managing files in your app and are integrating with Yammer, then you are much better off posting messages to Yammer with Open Graph metadata attached. The OG metadata points to the URL that hosts the files in your application. That way you avoid duplicating files between Yammer and your application. The og_ parameters on the messages endpoint are a useful starting point.
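For illustration, here is a rough C# sketch of posting such a message. The endpoint path, the og_* field names, the token, group id and URLs are all assumptions/placeholders; verify them against the current Yammer REST documentation before relying on any of them.

    // Rough sketch only: post a Yammer message whose Open Graph metadata points
    // at a file hosted in your own application, so the file itself never has to
    // be duplicated into Yammer. All endpoint paths, og_* field names, tokens
    // and URLs below are placeholders -- check the Yammer REST docs.
    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;

    class YammerOgPost
    {
        static async Task Main()
        {
            using var client = new HttpClient();
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", "YOUR_OAUTH_TOKEN"); // placeholder token

            var fields = new Dictionary<string, string>
            {
                ["body"]           = "Q3 budget spreadsheet is ready for review.",
                ["group_id"]       = "12345",                                    // placeholder group
                ["og_url"]         = "https://files.example.com/docs/q3-budget", // URL hosted by *your* app
                ["og_title"]       = "Q3 Budget",
                ["og_description"] = "Latest revision of the Q3 budget."
            };

            var response = await client.PostAsync(
                "https://www.yammer.com/api/v1/messages.json",
                new FormUrlEncodedContent(fields));

            Console.WriteLine(response.StatusCode);
        }
    }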
Related
I'm designing a website that will let the user synchronize a local folder to an online folder (kind of like Dropbox).
I'm trying to find a way to avoid developing a local tool to do the download and synchronization, and instead do it somehow online.
Is it possible to download multiple files in a certain directory tree?
Can a website have free write access to local directories?
A zip file is not an option, since the file batch could get pretty big.
EDIT: Synchronization shouldn't occur periodically, just when the user logs in to the website.
My recommendation is to use something like WebDAV.
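Since WebDAV is just HTTP with a few extra verbs, a simple pull-style sync can be built on top of any HTTP client. A minimal C# sketch, assuming a WebDAV share at a placeholder URL with basic auth and standard PROPFIND support (error handling, conflict detection and recursion into subfolders omitted):

    // Minimal sketch of pulling files from a WebDAV share: PROPFIND with
    // Depth: 1 to list the collection, then GET each file. The share URL and
    // credentials are placeholders.
    using System;
    using System.IO;
    using System.Linq;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;
    using System.Xml.Linq;

    class WebDavPull
    {
        static async Task Main()
        {
            var baseUri = new Uri("https://dav.example.com/files/"); // placeholder share
            using var client = new HttpClient();
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
                "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes("user:password")));

            // PROPFIND with Depth: 1 lists the immediate children of the collection.
            var request = new HttpRequestMessage(new HttpMethod("PROPFIND"), baseUri);
            request.Headers.Add("Depth", "1");
            var listing = await client.SendAsync(request);
            var xml = XDocument.Parse(await listing.Content.ReadAsStringAsync());

            XNamespace dav = "DAV:";
            var fileHrefs = xml.Descendants(dav + "href")
                               .Select(h => h.Value)
                               .Where(h => !h.EndsWith("/")); // skip sub-collections

            foreach (var href in fileHrefs)
            {
                var fileUri = new Uri(baseUri, href);
                var localName = Path.GetFileName(fileUri.LocalPath);
                File.WriteAllBytes(localName, await client.GetByteArrayAsync(fileUri));
                Console.WriteLine($"Downloaded {localName}");
            }
        }
    }

Another advantage of WebDAV is that most operating systems can mount a WebDAV share natively, which may remove the need for a custom local tool entirely.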
Before I attempt to program the following function myself, I wonder if something already exists.
What I would like to do is click an edit link on my website for a given document, and have that document launch in the native editor on my local machine (via a temporary file mechanism).
When I save the document in the native editor, the document is HTTP PUT back to the website. This can be accomplished by watching the file for writes, or watching the editor process for exit.
This way I can more easily edit documents on the web (instead of going through the download / edit / upload cycle).
My design would work as follows (a rough code sketch of these steps appears after the list):
Register .webedit files on the local machine.
When a .webedit file is downloaded, launch webedit.exe with the file.
The file contains a URL (http://server/document) which is checked against a security database to ensure we're only opening allowed URLs.
The URL is downloaded to a temporary location.
The temporary file is launched in the native editor.
The file is watched for changes, and uploaded (HTTP PUT) on change detection (or when the editor is closed, if it's not a single-instance multiple-document editor).
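A minimal C# sketch of the download / launch / watch / PUT part of this design, assuming the .webedit file contains only the URL; the security-database check and any authentication for the PUT are left as placeholders:

    // Minimal sketch: read the URL from the .webedit file, download it to a
    // temp file, open it in the associated editor, and HTTP PUT it back
    // whenever it changes on disk. The allowed-URL check is a TODO and the
    // server is assumed to accept an unauthenticated PUT -- both placeholders.
    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Net.Http;
    using System.Threading.Tasks;

    class WebEdit
    {
        static async Task Main(string[] args)
        {
            var url = File.ReadAllText(args[0]).Trim(); // the .webedit file holds the document URL
            // TODO: check url against the allowed-URL security database here.

            using var http = new HttpClient();
            var tempPath = Path.Combine(Path.GetTempPath(),
                                        Path.GetFileName(new Uri(url).LocalPath));
            File.WriteAllBytes(tempPath, await http.GetByteArrayAsync(url));

            // Open the temp file with whatever editor is registered for its extension.
            Process.Start(new ProcessStartInfo(tempPath) { UseShellExecute = true });

            using var watcher = new FileSystemWatcher(Path.GetDirectoryName(tempPath),
                                                      Path.GetFileName(tempPath));
            watcher.NotifyFilter = NotifyFilters.LastWrite;
            watcher.Changed += async (_, _) =>
            {
                // Editors often raise several change events per save; a real tool
                // would debounce and handle the file still being locked.
                using var content = new StreamContent(File.OpenRead(tempPath));
                await http.PutAsync(url, content);
                Console.WriteLine("Uploaded new version.");
            };
            watcher.EnableRaisingEvents = true;

            Console.WriteLine("Watching for saves. Press Enter to quit.");
            Console.ReadLine();
        }
    }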
Lots of FTP / SCP GUIs have this type of functionality, but I have not been able to find it for the web in general, or a shared library that allows you to plug in to this function.
Has anyone seen a program that does this?
SharePoint works like this.
It's great for managing shared documents in corporate environments.
Users can even check out/check in documents, and the features are very extensible; you can customize pretty much anything if you know how.
Edit:
Since you're on Linux, I've heard that Alfresco is a great alternative.
I've never used it, but I know a couple of organizations using it instead of SharePoint.
It integrates with Microsoft Office as well.
Also, it will definitely be cheaper.
Q:
I want to ask whether publishing the .cs and .aspx files to my server during the web application publishing process is considered bad practice and may cause a security violation.
Sometimes I have to do this because the report files don't get published or the CSS files don't work properly.
When should I use each of these options:
Only files needed to run this application.
All project files.
All files in the source project folder
This may be a misapplication of the principle, but I always think of the principle of least privilege. By that, I mean:
Do my users need to see any code files (applicable under both "All project files" and "All files in the source project folder")?
Do my users need to see any files in my project folder, but not included in my project (applicable under "All files in the source project folder")?
If the answer to those questions is no, then I publish using only files needed to run this application.
I once made the mistake of publishing a website using "All files in the source project folder", because I needed to deploy a bunch of .css and .js files from a plug-in I used, and didn't know how to quickly include those files in my web project.
However, as soon as I saw all my source code show up in my production folder, I quickly switched my publish option back to "Only files needed to run this application" and redeployed, deleting all files in the target folder. Then I looked around to find a way to include all files in a folder that was not in my project, and I've been happier since.
Honestly, even if my users needed to see code of some sort, I'd consider writing a quine before I'd publish copies of my .cs file on any website. People have differing opinions about Internet security, but I often think of this quote from Gene Spafford:
The only truly secure system is one that is powered off, cast in a block of concrete and sealed in a lead-lined room with armed guards - and even then I have my doubts.
If you look around here, you'll find various questions where users are trying to safely encrypt/decrypt connection strings, store data securely in their programs (or databases), and are otherwise trying their best to keep anyone -- even their most trusted users -- from getting access they otherwise shouldn't have.
As unlikely as it might be that a malicious user would try to access the files on your server, I can tell you that it's a lot harder for a malicious user to access the files on my server, because those files don't exist on my server.
Ensure your IIS settings are such that .cs files are not served publicly. The same should apply to any other sensitive or non-public file types, such as .config.
.aspx files contain your markup, so are typically fine to publish and serve publicly.
I am currently working on a project where I need to store a few files and folders in an encrypted manner. This project will be platform independent and hence will be written in Java.
Instead of encrypting individual files and folders, we have been thinking of using a virtual file system where a single container file holds the complete file system.
Most of the open source virtual encrypted file system tools we studied work on the following principle:
mount the virtual file system (using secure password)
use this filesystem
finally dismount it
But the main problem we face here is that anyone who has access to the PC (e.g. a network admin) will be able to see the decrypted files while the virtual drive is mounted. We want to restrict access to the encrypted file system at the process level. No one else in the same OS session should be able to see the contents, hence no drive mounting, etc.
So we are looking for an open source tool that provides APIs through which we can access files in the encrypted container without mounting it.
Can anyone point us to such a library?
Normally I'd say this was pretty cool:
http://www.pismotechnic.com/pfm/
But I recently accidentally copied a sub-repository in a Mercurial repository to another folder, and when that happened a lot of files got magically messed up. If you don't mind possible issues like that (e.g. by keeping backups), this could be a solution for you.
I've stumbled upon this question while hunting for an alternative because corrupted files are definitely not on my requirement list.
I'm building an app in ASP.NET that will store some pictures of objects. The pictures will be uploaded by suppliers and downloaded by subscribers. In between, they will have to be edited before becoming available to subscribers.
The editing involves creating a cropping path tightly around the object in the picture, for which some advanced desktop image software will have to be used, I suppose.
My problem is in exchanging pictures between my ASP.NET app and the desktop software in a manner that is easy and transparent for the user.
I've done some thinking and I've come up with:
- Manually downloading and uploading the image (not very user-friendly...)
- An image editing program that can upload to a web service (Haven't found yet...)
- Develop a plug-in for an image editing program (Too advanced...)
I'd appreciate any suggestions you may have, thank you!
It sounds like you need some automation to move files between the web server and a file share. I am assuming that the number of images that need to be processed is pretty large, because if it's not, then the overhead of downloading/re-uploading each would not be that much.
So do the following:
1) Create an API for your web app that lists files that are available, or new files since some date/time, or files that have been marked as "new". The API should probably also allow marking a status on them (so you can tell it when you've finished pulling something down and it won't be offered again) if you don't want to trust the date/time as an indicator of it being new.
2) Write an app (non-web) that runs on a schedule and uses this API to automatically download files to a shared filesystem area on your local network, and marks them as "downloaded".
The app should also monitor these files (the ones it downloaded & saved to your local share) for changes, and if changed, upload them back to your web app. To do this you may need to keep a database of filenames and modification dates/times.
This shouldn't be too hard to write in whatever language you are using for your web app (assuming C# or VB). By "API" I just mean a web page that provides a list in a standardized format (e.g. JSON) that you can parse with your automation application, and another page that allows posting the file back for re-upload.
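To make that concrete, here is a rough C# sketch of the puller side. The endpoint paths, the JSON shape and the shared-folder location are all placeholders to adapt to however your web app actually exposes its file list and status updates:

    // Rough sketch of the scheduled puller described above. All URLs, the JSON
    // shape and the network share below are placeholders.
    using System;
    using System.IO;
    using System.Net.Http;
    using System.Net.Http.Json;
    using System.Threading.Tasks;

    record PendingImage(int Id, string FileName, string DownloadUrl);

    class ImagePuller
    {
        const string ListUrl  = "https://example.com/api/images/pending"; // placeholder listing page
        const string LocalDir = @"\\fileserver\images\incoming";          // placeholder network share

        static async Task Main()
        {
            using var http = new HttpClient();

            // 1) Ask the web app which images are new / not yet pulled down.
            var pending = await http.GetFromJsonAsync<PendingImage[]>(ListUrl)
                          ?? Array.Empty<PendingImage>();

            foreach (var image in pending)
            {
                // 2) Download each one to the shared folder the editors work from.
                var bytes = await http.GetByteArrayAsync(image.DownloadUrl);
                await File.WriteAllBytesAsync(Path.Combine(LocalDir, image.FileName), bytes);

                // 3) Tell the web app the file has been pulled so it isn't offered again.
                await http.PostAsync($"https://example.com/api/images/{image.Id}/mark-downloaded", null);
            }
        }
    }

The re-upload direction would be a similar loop driven by file modification times or a small database of filenames and timestamps, as described above.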
I'm assuming that the web server is not your own, or generally, you can't simply have it save the file uploads directly to some area where your image editors can access them. Otherwise you could just do that.
Meanwhile I came up with another possible solution.
I'm thinking of having our own Windows app on the editors' computers. This app will be associated with a custom extension. When an editor downloads a file with this extension for editing, it will be opened in our application, which in turn will open the image in some editor program.
This app will monitor the files for changes and, when a change is detected, upload the images back.
Any thoughts on this?