I know that Dynamics CRM 2015 comes with encryption enabled for specific fields:
EmailServerProfile IncomingPassword
EmailServerProfile OutgoingPassword
Mailbox Password
Queue EmailPassword
UserSettings EmailPassword
I want to use the built-in encryption feature to encrypt custom fields (columns). My instance of CRM has SSL and Data Encryption enabled.
How can I do this?
EDIT: Query to find encrypted columns.
SELECT [TableColumnName], [IsEncrypted]
FROM [Our_Organization].[MetadataSchema].[Attribute]
WHERE IsEncrypted = 1
I haven't seen anything saying that you can encrypt custom fields, so I think the answer is "no".
I created a SQLite database with DB Browser for SQLite (not encrypted) and opened it with FireDAC in Delphi (I can retrieve data, e.g. SELECT * FROM abc).
How do I encrypt this SQLite database with FireDAC? When I enter the username and password and try to encrypt, I get the message "Cipher DB is not encrypted".
Note:
When I create the SQLite database from Delphi FireDAC itself, I can use encryption!
To encrypt a database, use a TFDSQLiteSecurity component. You'll also need a SQLite driver link component (TFDPhysSQLiteDriverLink) to go along with it.
If a database is unencrypted, its password is ''. So use '' as the OldPassword and set the new password in that case. Passwords are formatted as algorithm:PassPhrase; see the documentation for the available algorithms (I use aes-256). Also, the database needs to be closed when you do this.
...
//Change password
FDSQLiteSecurity1.Password := OldPassword;
FDSQLiteSecurity1.ToPassword := NewPassword; // example: 'aes-256:mypassword123'
FDSQLiteSecurity1.ChangePassword;
...
//Remove Password
FDSQLiteSecurity1.Password := OldPassword;
FDSQLiteSecurity1.ToPassword := '';
FDSQLiteSecurity1.RemovePassword;
...
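For completeness, here is a minimal runtime sketch of the surrounding setup, since the snippet above assumes the components already exist and point at a database. The file path is a placeholder, the driver link class is named TFDPhysSQLiteDriverLink in recent FireDAC versions, and the database must be closed while this runs:

uses
  System.SysUtils, FireDAC.Phys.SQLite;

procedure EncryptExistingDatabase;
var
  DriverLink: TFDPhysSQLiteDriverLink;
  Security: TFDSQLiteSecurity;
begin
  DriverLink := TFDPhysSQLiteDriverLink.Create(nil);
  Security := TFDSQLiteSecurity.Create(nil);
  try
    Security.DriverLink := DriverLink;
    Security.Database := 'C:\data\abc.db';          // placeholder path to the unencrypted database
    Security.Password := '';                        // an unencrypted database has password ''
    Security.ToPassword := 'aes-256:mypassword123'; // <algorithm>:<passphrase>
    Security.ChangePassword;                        // encrypts the (closed) database file
  finally
    Security.Free;
    DriverLink.Free;
  end;
end;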
From the Documentation
SQLite Encrypted Database
Approach
One of the distinctive SQLite features is the high-speed strong database encryption. It allows you to make database file content confidential and enforce integrity control on the database file. The encrypted database format is not compatible with other similar SQLite encryption extensions. This means that you cannot use an encrypted database, encrypted with non-FireDAC libraries. If you need to do this, then you have to decrypt a database with an original tool and encrypt it with FireDAC.
Recent Delphi versions come with an example project for working with encryption on SQLite databases; see this documentation. I have not used it myself, by the way.
It includes this section:
Encrypt DB
Encrypt: Encrypts the database according to the Encryption mode and the password provided.
The sample uses TFDSQLiteSecurity.SetPassword to encrypt the database with the password provided.
The database password is the combination of <encryption algorithm>:<password>.
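As a rough illustration of that call (not taken from the sample project; the path and passphrase are placeholders, and the database file must be closed), encrypting a previously unencrypted database looks like this:

FDSQLiteSecurity1.Database := 'C:\data\abc.db';        // placeholder path
FDSQLiteSecurity1.Password := 'aes-256:mypassword123'; // <encryption algorithm>:<password>
FDSQLiteSecurity1.SetPassword;                         // encrypts the database file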
I faced several challenges when I first tried to encrypt a SQLite database for use with Embarcadero FireDAC. Although all the information is published by Embarcadero, the question pops up again and again on different forums. My case was solved thanks to community support, and when time permitted I assembled a simple Delphi application, which is available on SourceForge. I hope it makes encryption/decryption slightly easier, particularly for newbies:
https://sourceforge.net/projects/sqlite-sequrity-for-delphi/
Is it possible to "retarget" keys generated via the ncipher JCE API to pkcs11? I know that you can retarget via the generatekey command but I don't see how to do it to an existing JCE key. The first prompt is for the "source application" and the options don't seem to include JCE. Does it support other options beyond the ones listed there or should I be looking at a different way of retargeting?
The ultimate goal here is to export a couple keys (asymmetric and symmetric) that were generated via nCipher's JCE API (yes, I know that an HSM's job is to secure the keys and exporting is usually not a good idea but it is a requirement here). We are able to export keys that were generated via the PKCS11 interface but not ones that were generated via the JCE so our thinking is that if we can retarget it from JCE to PKCS11 we might be able to export these keys as well. If there is another way to do this we are open to that as well.
Lastly, the JCE keys show up as "recovery enabled" when executing the nfkminfo on them. Does that mean that they are exportable or does recovery here mean something else?
Disclaimer: I work for Thales e-Security but do not speak for the company.
Yes you can retarget a jcecsp key to pkcs11. If you have any jcecsp keys in your kmdata/local, /opt/nfast/bin/generatekey will offer jcecsp as a source option. If you have no keys of that ilk, it will quietly omit that option from the source list. However, this retarget process may not do what you think it does. All retargeting does is change the application type and potentially the associated metadata: it doesn't change the fundamental capabilities of the key as those were baked into the protected key blob at generation time and cannot be changed.
The Security World uses nShield key ACLs to limit the key's capabilities (Sign, Verify, Encrypt, Decrypt, Wrap, Be Wrapped, etc.). PKCS#11 pulls its parameters (CKA_SIGN, etc.) directly from the key ACLs, and when generating keys through the API, the ACLs saved in the key blob are derived directly from the parameters in the key template. If you set CKA_SENSITIVE to FALSE, and your Security World allows it, you can generate and save an exportable key. JCE is not that sophisticated: it has no concept of key capabilities at all, so the Provider has to guess at the user's intent with the key and it defaults to a fairly generous set. However, since as you point out the whole idea of HSMs is to protect key bits and not let you have them, Export is not one of the defaults. And what's not baked into the key file when you create it, you don't get by retargeting the key.
One thing you could do if you want to use JCE is to generate the key using a different Provider and then store it in an nCipher.sworld KeyStore using the nCipherKM Provider: this will import the key into the Security World (if your World allows that) and save it as a key_jcecsp_* file. However this has nothing to do with key security so from an HSM perspective it's not recommended. Another thing you could do is to drop down to the native nCore API, generate the key with the ACL entries you require, and then polymorph it to a JCE Key Object and save it in the HSM-backed KeyStore. You can shoot yourself in the foot as many times as you want with the ACLs on the key you create. The polymorphing is very poorly documented: ask Thales Support and they can guide you.
Finally, the Recovery capability means that in addition to the Working Key blob which may be protected by an Operator Card Set, the key file has a Recovery Blob. This is in case that Operator Card Set is lost: the Recovery Blob can be opened up by the Administrator Card Set of the Security World using the rocs utility (Replace Operator Card Set), which will write a new key file under a new OCS. No, this does not mean the key is exportable. It just means that you are protected against losing the OCS. Of course losing the ACS is a non-starter as that is your Root of Trust.
I have been searching for an answer on MS, SE and Google and cannot find it. I want to use the GRS option for Azure Storage (Cloud Block Blobs) but I cannot figure out how to properly do that.
I created my storage object in Azure and chose the GRS option.
I get that I have a primary and secondary connection string and know how to get that from the Azure portal.
What I do not know, in ASP.NET 4.0, is how to set both connection strings in the CloudBlobClient and gracefully handle the primary storage being unavailable.
- What exception is thrown, and where, when the primary is unavailable? Is it thrown when I create the client, or when I try to get a blob reference?
- How do I then use the secondary?
Do I have to just test for any old exception and then try using the secondary connection string in a new CloudBlobClient if the primary does not work? Or is there anything in the API for this? I would think there would be, but I cannot find it.
None of the "How to use Azure Storage" tutorials I have seen go into this. Most of the documentation seems to date from before mid-2014 when this feature became generally available.
This blog post should help you. In short, if you want to read from both the primary and the secondary, you need to enable RA-GRS, which gives you read access to the secondary endpoint. If you are using our storage client libraries, you can also enable a retry policy that will first try to read from the primary and then from the secondary if the first read fails.
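As a rough sketch of that client-library approach (assuming the Microsoft.WindowsAzure.Storage library, version 3 or later, and a placeholder RA-GRS connection string; note that writes still go to the primary only):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.RetryPolicies;

public static class BlobReader
{
    public static string ReadText(string connectionString, string containerName, string blobName)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient client = account.CreateCloudBlobClient();

        // Read from the primary endpoint first; fall back to the RA-GRS secondary if it fails.
        client.DefaultRequestOptions.LocationMode = LocationMode.PrimaryThenSecondary;
        client.DefaultRequestOptions.RetryPolicy = new ExponentialRetry();

        CloudBlockBlob blob = client.GetContainerReference(containerName)
                                    .GetBlockBlobReference(blobName);
        return blob.DownloadText();
    }
}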
I'm working on a web portal for customers that will connect to Microsoft Dynamics. I don't want to make Dynamics CRM directly an internet-facing deployment (IFD), so I'd like to use a separate database that the web interface interacts with, and then use web services to move the data between the web portal database and Dynamics CRM.
I'm just looking for thoughts on whether this is the best way to proceed and whether there are any good code examples, etc. that I can look at for implementing this?
I saw Microsoft has a Customer Portal but it looks like it requires (at a cursory glance) an IFD deployment - which I don't want.
First, after creating your ASP.NET project (WebForms or MVC 3), add the following references:
Microsoft.Crm.Sdk.Proxy
Microsoft.Xrm.Sdk
System.Runtime.Serialization
System.ServiceModel
In your code-behind, create a class and then add the following code:
private IOrganizationService GetCrmService(string userName, string password, string domain, Uri serviceUri)
{
    ClientCredentials credentials = new ClientCredentials();
    credentials.Windows.ClientCredential = new System.Net.NetworkCredential(userName, password, domain);
    //credentials.UserName.UserName = userName; // use these instead for claims-based (IFD/Online) authentication
    //credentials.UserName.Password = password;
    ClientCredentials deviceCredentials = new ClientCredentials();

    // Do not wrap the proxy in a using block here: disposing it before returning would
    // leave the caller with an unusable service. The caller is responsible for disposing it.
    OrganizationServiceProxy serviceProxy = new OrganizationServiceProxy(serviceUri, null, credentials, deviceCredentials);
    serviceProxy.ServiceConfiguration.CurrentServiceEndpoint.Behaviors.Add(new ProxyTypesBehavior());
    return (IOrganizationService)serviceProxy;
}
If you want to retrieve multiple records:
string fetch = @"My Fetch goes here";
EntityCollection records = GetCrmService(userName, password, domain, serviceUri).RetrieveMultiple(new FetchExpression(fetch));
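For illustration only (the entity and attribute names below are generic examples, not taken from the original answer), a concrete FetchXML string and a loop over the returned records might look like this:

// requires: using Microsoft.Xrm.Sdk; using Microsoft.Xrm.Sdk.Query;
string fetch = @"
  <fetch>
    <entity name='account'>
      <attribute name='name' />
      <filter>
        <condition attribute='statecode' operator='eq' value='0' />
      </filter>
    </entity>
  </fetch>";

EntityCollection records = GetCrmService(userName, password, domain, serviceUri)
    .RetrieveMultiple(new FetchExpression(fetch));

foreach (Entity account in records.Entities)
{
    Console.WriteLine(account.GetAttributeValue<string>("name")); // prints each active account's name
}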
I highly recommend downloading the SDK or checking this.
You'll find many samples and walkthroughs which will help you to build good portals.
I think it's a good strategy because:
It allows you to asynchronously put the data entered on the website into the CRM. This decoupling ensures neither the CRM nor the website becomes the other's bottleneck.
Only the intermediate service layer is internet facing, so you'll be in control of what CRM information could be disclosed or altered if this service layer is compromised.
The architecture you're after is reminiscent of the way the CRM Asynchronous Service works (asynchronous plugins and workflows work this way):
A job is put in a queue (table) in the CRM DB.
A scheduled service awakes every x seconds and fetches the latest y records from the queue table.
The service performs each job and writes the result (success, error message log) back to the queue table's records.
So the hardest part is probably writing a good scheduled service that never lets an exception escape (it always digests them) and that properly logs the results back to the DB.
To learn more about the Dynamics CRM's "Asynchronous Service Architecture", refer to the following: http://msdn.microsoft.com/en-us/library/gg334554.aspx
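For what it's worth, a bare-bones sketch of that polling pattern (all type and member names here are hypothetical placeholders, and the portal-DB and CRM calls are stubbed out) could look like this:

using System;
using System.Collections.Generic;
using System.Threading;

public class PortalQueueWorker
{
    private readonly TimeSpan _pollInterval = TimeSpan.FromSeconds(30);

    public void Run(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            try
            {
                // 1. Fetch the latest pending jobs from the portal DB queue table.
                IEnumerable<PortalJob> jobs = FetchPendingJobs(batchSize: 50);

                foreach (PortalJob job in jobs)
                {
                    try
                    {
                        // 2. Push the job's data into CRM via the Organization Service.
                        ProcessJob(job);
                        MarkJobCompleted(job);
                    }
                    catch (Exception ex)
                    {
                        // 3. Never let a single job kill the service: log and continue.
                        MarkJobFailed(job, ex.Message);
                    }
                }
            }
            catch (Exception ex)
            {
                LogError(ex); // e.g. the portal DB was unreachable
            }

            token.WaitHandle.WaitOne(_pollInterval); // sleep until the next poll
        }
    }

    // Placeholders for the portal-DB and CRM calls.
    private IEnumerable<PortalJob> FetchPendingJobs(int batchSize) { return new List<PortalJob>(); }
    private void ProcessJob(PortalJob job) { /* IOrganizationService.Create/Update */ }
    private void MarkJobCompleted(PortalJob job) { }
    private void MarkJobFailed(PortalJob job, string error) { }
    private void LogError(Exception ex) { }
}

public class PortalJob { public Guid Id { get; set; } /* plus payload columns */ }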
It looks like a good approach.
It will improve the performance of both the portal and CRM.
The data shown on the portal is NEARLY real-time, i.e. it is NOT real-time.
Throughout development, keep checking that there is not TOO MUCH async processing keeping the CRM server busy all the time.
I don't think the accelerators/portals REQUIRE CRM to be an IFD instance; I guess only the portal part needs to be internet facing (of course, to make it usable for the purpose!).
Anwar is right, the SDK is a good launchpad for such research.
The Customer Portal does not require an IFD deployment. And if you do not like the Customer Portal, you can always use the SDK extensions for portal development (Microsoft.Xrm.Client.dll & Microsoft.Xrm.Portal.dll and the portalbase solution), which are all included in the SDK.
There is a great resource on how to build a portal using the SDK portal extensions:
Dynamics CRM 2011 Portal Development
We are using Informix as the DB for our application.
We have a new requirement to encrypt a single column (ID). The encryption should not be external; it should be done in the DB itself.
IBM explains the encryption procedure in http://publib.boulder.ibm.com/infocenter/idshelp/v10/index.jsp?topic=/com.ibm.sqls.doc/sqls1024.htm
The steps are as follows:
SET ENCRYPTION PASSWORD 'credit card number is encrypted'
WITH HINT 'Why is this difficult to read?';
INSERT INTO customer VALUES ('Alice',
encrypt_tdes('1234567890123456'));
INSERT INTO customer VALUES ('Bob',
encrypt_tdes('2345678901234567'));
SELECT id, DECRYPT_CHAR(creditcard,
'credit card number is encrypted') FROM customer;
But when I follow the same steps, the DB throws an error at the very first step (at SET ENCRYPTION PASSWORD):
"SQL -26040: Encrypt VP initialization failed."
I am not sure what the actual issue is, as I couldn't find a satisfactory solution.
Could someone help us solve this?
The issue was that the encrypt VP (virtual processor) was not present on our Informix server. Once we set up the encrypt VP, the encryption was successful.
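For reference, a typical way to define the encrypt VP (based on my reading of the IBM Informix documentation; the VP count below is just an example, so check the docs for your version):

# In $INFORMIXDIR/etc/$ONCONFIG, then restart the instance:
VPCLASS encrypt,num=1

# Or add an encrypt VP dynamically, without a restart:
onmode -p +1 encrypt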
Thanks,