A customer's security policy requires encrypting the data stored in a Plone site at the application level.
It's acceptable that a Plone admin can access all information stored in the CMS, but the sysadmin should not be able to copy files or browse database dumps for useful information.
(So using EncFS is not an option, because the sysadmin can do exactly that.)
To encrypt the database, cipher.encryptingstorage should be sufficient.
Is there a way to encrypt files in Zope's blobstorage in a similar way?
We finally made cipher.encryptingstorage support blobs, too.
#pcdummy described the setup in a blog post on our website.
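For reference, the wrapper is hooked into zope.conf much like zc.zlibstorage. The directive names and paths below are a sketch from memory, modeled on zc.zlibstorage's ZConfig setup; treat them as assumptions and check the cipher.encryptingstorage documentation (and #pcdummy's post) for the authoritative syntax:

```
%import cipher.encryptingstorage

<zodb_db main>
  <encryptingstorage>
    config /path/to/encryption.conf
    <filestorage>
      path /path/to/var/filestorage/Data.fs
      blob-dir /path/to/var/blobstorage
    </filestorage>
  </encryptingstorage>
</zodb_db>
```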
The situation: I'm going to implement a digital repository using Alfresco Community Edition 5.1 to manage our university's digital content (software installers, books, theses), which is currently stored on different FTP servers. I intend to use Alfresco as a backend and Orchard CMS as our intranet frontend (a non-functional requirement), and have the two communicate via CMIS. The general idea is a social-networking approach in which every user can modify metadata and add tags to improve search. Improving search and download is the main objective of my work: right now it takes a long time to find anything, because the content sits on FTP servers without good cataloging.
I already successfully created a custom data model, but when I decided to migrate the content from these FTP servers, I couldn't find any documentation about it. I read about the bulk import tool, but it turns out I'd need the data locally on the same computer that runs Alfresco, and as I said, the data sources are different FTP servers.
So how can I migrate data from different FTP servers into Alfresco? Is it necessary to physically import the files into Alfresco, or can I work with an index pointing to the FTP files (keep the files on the FTP servers and store in Alfresco only a reference to each object)? My only functional requirements are search and download.
Please, I need your guidance: here in Cuba we don't have experience working with Alfresco, and it is very difficult to get access to the internet. If you can point out a way to solve this, or offer any recommendation, I will be forever grateful. Thank you, and again, sorry to disturb you.
If this were a one-time thing, you could use an FTP client of some sort to simply drag and drop the files from your FTP server into Alfresco's FTP server. This would copy the files only and would not set any custom metadata.
Alternatively, you could write some Java to do this. Java can read from FTP servers and can write to Alfresco via CMIS. This would give you the opportunity to set some properties on the objects written into Alfresco beyond just the file name, creation date, and modification date.
However, if you are going to do this regularly, you might want to look at an integration tool. For example, you could use Apache Camel to watch the FTP servers, and when there is a change, it could fetch the file and write it to Alfresco via CMIS.
This will likely take some coding to make it work exactly right, but hopefully this gives you some options to consider.
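To make the "read from FTP, write via CMIS" idea concrete, here is a minimal sketch in Python. Host names, credentials, and the `create_document` callback are placeholders; in practice that callback would wrap a CMIS client (for example Apache Chemistry's cmislib and its `folder.createDocument` call), which is also where you would set your custom properties:

```python
from ftplib import FTP
from io import BytesIO

def plan_uploads(ftp_paths, target_folder):
    """Map remote FTP file paths to (source, target repository path) pairs."""
    plan = []
    for path in ftp_paths:
        name = path.rsplit("/", 1)[-1]
        plan.append((path, f"{target_folder.rstrip('/')}/{name}"))
    return plan

def migrate(ftp_host, ftp_user, ftp_password, remote_dir, create_document):
    """Download each file in remote_dir and hand it to create_document(name, data).

    create_document is a placeholder for the CMIS write, e.g. cmislib's
    folder.createDocument(name, contentFile=BytesIO(data)).
    """
    ftp = FTP(ftp_host)
    ftp.login(ftp_user, ftp_password)
    ftp.cwd(remote_dir)
    for name in ftp.nlst():
        buf = BytesIO()
        ftp.retrbinary(f"RETR {name}", buf.write)  # stream the remote file
        create_document(name, buf.getvalue())
    ftp.quit()
```

An integration tool like Camel would replace the `migrate` loop with a route that watches each FTP endpoint, but the shape of the work is the same.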
We have ColdFusion and Active Directory running within the same domain. I'm trying to write a script that ColdFusion can run to get the sizes of client directories within the file system. However, each directory requires different Windows authentication.
ColdFusion's <cfdirectory> doesn't have a means to impersonate the user for each directory. Using Java I/O also appears to be a dead end, since Java knows nothing about Windows authentication.
I found a suggestion to write a COM or .NET component in ASP.NET, but the article didn't go any further. I have since googled my heart out and found nothing more useful.
If anyone has any information or assistance to offer, you can't imagine my gratitude.
ColdFusion will, by default, run as the Local System account. This account will not have access to the network.
I would suggest running the ColdFusion service with a domain account that has read access to the client directories and you should then be able to use cfdirectory.
You may find this blog post helpful.
It seems the problem is that each directory in question requires a different set of user permissions. So I think he's right: CF can't "impersonate" each user, though I'm not sure I know of a system that can do that. Would you have to store separate credentials for each directory? Ask for a password for each one?
I don't think you can "impersonate" an AD account without creating the appropriate tokens through the key infrastructure, right? That means you have to store usernames and passwords to authenticate with. Storing and using such impersonation credentials would surely negate any security enhancement you hoped to gain from using a different user for each directory.
Architecturally, you are better off running CF as an AD user with read permission on the directories in question. Then you can access the size information you are looking for while still being unable to modify the files within the directories. That's my take.
We decided to build an ASP.NET application that accesses the individual client directories using built-in Windows authentication, returns file sizes and other data, and dumps them into a database. We will use Windows Task Scheduler to run this application. ColdFusion will be used merely to shape the data into reports.
I've seen some ASP.NET developers encrypt connection strings kept in a separate config file. Why did they do that? I know that config files are unreadable from the client side/browser. Is it possible to access these kinds of files?
You can't rule out that the web box is compromised.
Also, you don't want the web admins to know passwords to databases.
You need to remember that config files are kept from the browser only because the .config extension is on the list of blocked extensions in the IIS configuration. It may still be possible to get them from the server some other way, or a misconfiguration may allow them to be downloaded.
They can be accessed by maintenance personnel, backup operators, or anyone else who has access to the disk without going through the website. That's one example.
If you deploy your web.config with custom errors set to "Off", any error produced by your web application can display your code to the visitor. That could even include lines from your config files, making the connection strings visible to the public.
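The safer setting keeps detailed errors away from remote visitors. In web.config that looks like the following (the redirect page name is just an example):

```xml
<system.web>
  <customErrors mode="RemoteOnly" defaultRedirect="~/Error.aspx" />
</system.web>
```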
To every organization, the most important thing is its data.
This is done where multiple developers are working on the same application.
Sometimes new developers also join the team. Exposing every aspect of your database, system, login name, and machine name is never a good approach.
There is a chance of an information leak in production, in testing, in the Q/A phase, etc.
It comes in really handy in the case of code theft within an organization: at least your data is safe from outside intrusion, because the connection strings were encrypted.
Can you bear the risk of someone accessing your database and performing a table/schema drop, or a delete-all on your tables?
MSDN: How to secure connection strings when using a datasource
Some of the problems that can happen are timeouts, disconnections, and not being able to resume an interrupted file, having to start again from the beginning. Assuming these files can be up to around 5 GB in size, what is the best solution for dealing with this problem?
I'm using a Drupal 6 install for the website.
Some of my constraints due to the server setup I have to deal with:
Shared hosting with max 200 connections at a time (unlimited disk space)
Unable to create users through an API (so I can't automatically generate FTP accounts)
I do have the ability to run cron-type scripts via a Drupal module.
My initial thought was to create FTP users based on Drupal accounts and require them to download an FTP client for their OS of choice. But the lack of an API to auto-create FTP accounts, and the inability to do it from the command line, get in the way of that solution. If anyone can think of a workaround, let me know!
Thanks
Shared hosts usually do not support large file uploads through the browser. A solution may be to use separate file hosting for your large uploads. A nice and easy option to integrate is Amazon S3 and its browser-based upload with a form POST.
It can be integrated in a custom module that provides an upload form protected by Drupal access control. If you need the files to end up on the Drupal server, you can use cron (either Drupal's or an external one) to move the files from S3 to your own hosting.
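The server-side part of the S3 browser POST is small: you build a policy document, base64-encode it, and sign it with your secret key (this sketch uses the legacy HMAC-SHA1 signing that browser-based POST uploads used; the bucket name, key prefix, and limits are placeholders):

```python
import base64
import hashlib
import hmac
import json

def sign_s3_post_policy(policy: dict, secret_key: str):
    """Return (base64 policy, base64 HMAC-SHA1 signature) for an S3 form POST."""
    encoded = base64.b64encode(json.dumps(policy).encode("utf-8")).decode("ascii")
    signature = base64.b64encode(
        hmac.new(secret_key.encode("utf-8"), encoded.encode("utf-8"), hashlib.sha1).digest()
    ).decode("ascii")
    return encoded, signature

# Hypothetical policy: restrict the upload to one bucket, one key prefix,
# and a maximum size (here ~5 GB, matching the question).
policy = {
    "expiration": "2030-01-01T00:00:00Z",
    "conditions": [
        {"bucket": "example-uploads"},
        ["starts-with", "$key", "uploads/"],
        {"acl": "private"},
        ["content-length-range", 0, 5 * 1024 ** 3],
    ],
}
encoded_policy, signature = sign_s3_post_policy(policy, "EXAMPLE_SECRET")
```

The encoded policy and signature then go into hidden fields of the HTML form your module renders, so the browser uploads straight to S3 without touching the shared host's PHP limits.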
You're kind of limited in what you can do on a shared host. Your best option is probably to install SWFUpload and hope there aren't a lot of mid-upload errors.
Better options that you probably can't use on a shared host include the upload progress PHP extension (which Drupal automatically uses when it's installed) and, as you said, associating FTP accounts with Drupal accounts.
We're migrating one of our sites to ASP.NET. We do not want to use integrated security, which uses the Windows account to connect to SQL Server (not going to get into why; it's just out of the question). We created a username and password to connect to SQL Server and would like to use them. However, we also do not want other developers to see this information (it's easily read from web.config). I know it can be encrypted, but it can just as easily be decrypted by the developers, plus encryption has a performance hit.
Is there any solution to this problem?
Here's a good tutorial on Encrypting Configuration Information in ASP.NET 2.0 Applications.
Just don't give the other developers the key.
Alternatively, you can lock down the authentication for SQL via installed certificates. This way you are setting security based on the client, not the user. http://msdn.microsoft.com/en-us/library/ff649255.aspx
Our standard practice is to have one "Developer Login" used in the development database that has limited access and have a different username/password for the production box. Developers do not have access to the production box, only the lead developers, and then the production web.config is copied over via the deployment script.
Do the developers need access to the web.config file? If so, I think you may be out of luck. If not, meaning that they do not ever need to change the web.config file, change the permissions on it so that only admins and the asp.net process can read the file.
In my experience, it tends to be difficult to hide that kind of thing from your internal devs. Even encrypting the config information in web.config would still reveal it if your developers just stepped through the code.
I would guess that, if you had to do this, you could create a private constant string in the code for your DB connection string and then run Dotfuscator or a similar obfuscator on the compiled application. Obviously, the source code itself would also have to be protected, or your developers otherwise prevented from accessing it.
You can't really protect the password from developers; besides, what sense does it make?
What you can do is have a separate development server that developers have access to, and a production environment that they don't.
Don't developers ever need to log on directly to the database to run some tests? If they do, it would make sense to test using the same account the application uses; otherwise the test results may not reflect reality.
Prompt for the password when the application connects for the first time, and keep the password in the session. Then only you will be able to connect to the database from anywhere; redirect all other users, who don't have the password, to an application-unavailable page.