SAML with a SKI instead of a certificate in the X509Data node

I have already implemented an authentication mechanism based on the SAML protocol. The project uses the SAML2 library. Everything worked fine until a change occurred on the server. The server used to respond with a <ds:X509Certificate> node:
<ds:KeyInfo><ds:X509Data><ds:X509Certificate>Here was certificate</ds:X509Certificate></ds:X509Data></ds:KeyInfo>
But it has changed to:
<ds:X509SKI>Here is Subject Key Identifier</ds:X509SKI>
The SAML2 library has a CheckSignature method that can be applied to the server response:
/// <summary>
/// Checks the signature.
/// </summary>
/// <returns>True if the signature is valid, else false.</returns>
public bool CheckSignature()
{
return XmlSignatureUtils.CheckSignature(Document);
}
It points here:
/// <summary>
/// Verifies the signature of the XmlDocument instance using the key enclosed with the signature.
/// </summary>
/// <param name="doc">The doc.</param>
/// <returns><code>true</code> if the document's signature can be verified. <code>false</code> if the signature could
/// not be verified.</returns>
/// <exception cref="InvalidOperationException">if the XmlDocument instance does not contain a signed XML document.</exception>
public static bool CheckSignature(XmlDocument doc)
{
CheckDocument(doc);
var signedXml = RetrieveSignature(doc);
if (signedXml.SignatureMethod.Contains("rsa-sha256"))
{
// SHA256 keys must be obtained from message manually
var trustedCertificates = GetCertificates(doc);
foreach (var cert in trustedCertificates)
{
if (signedXml.CheckSignature(cert.PublicKey.Key))
{
return true;
}
}
return false;
}
return signedXml.CheckSignature();
}
And finally, the GetCertificates method looks like this:
/// <summary>
/// Gets the certificates.
/// </summary>
/// <param name="doc">The document.</param>
/// <returns>List of <see cref="X509Certificate2"/>.</returns>
private static List<X509Certificate2> GetCertificates(XmlDocument doc)
{
var certificates = new List<X509Certificate2>();
var x509CertificateNodeList = doc.GetElementsByTagName("ds:X509Certificate");
if (x509CertificateNodeList.Count == 0)
{
x509CertificateNodeList = doc.GetElementsByTagName("X509Certificate");
}
foreach (XmlNode xn in x509CertificateNodeList)
{
try
{
var xc = new X509Certificate2(Convert.FromBase64String(xn.InnerText));
certificates.Add(xc);
}
catch
{
// Swallow the certificate parse error
}
}
return certificates;
}
As you can see, the library checks only certificates, not subject key identifiers. I believe I could implement the SKI comparison between an installed certificate and the provided element on my own, but I'm not sure whether that is a legitimate way to do it.
Here Thomas Pornin wrote:
The Subject Key Identifier does not play a role in validation, at least not in the algorithm which makes up section 6 of RFC 5280. It is meant to be a help for path building.
His statement suggests I can't do validation by comparing the SKI from the server response with an installed certificate.
RFC 5280 suggests the same, but I don't have enough time to read it carefully, so I'm asking for your help.
Is comparing the subject key identifier of an installed X509 certificate with the one in the SAML response the right way to verify the response?

No. As already mentioned, the SKI is used only to bind certificates in the chain (when key matching is used). It doesn't provide enough information about the certificate and its details.
However, if the client has the full certificate preinstalled, it can use the SKI to locate the right certificate and then use that certificate for the validation procedures.
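If you go that route, the verification itself still has to be done against the full, preinstalled certificate. Below is a minimal, untested sketch of that idea; the helper name, the store location (LocalMachine/My), and the assumption that <ds:X509SKI> carries the base64-encoded SKI octets are mine, not part of the SAML2 library:
private static X509Certificate2 FindInstalledCertificateBySki(XmlDocument doc)
{
    // Read the <ds:X509SKI> element from the signature's KeyInfo.
    var skiNodes = doc.GetElementsByTagName("X509SKI", "http://www.w3.org/2000/09/xmldsig#");
    if (skiNodes.Count == 0)
    {
        return null;
    }

    // XMLDSIG defines X509SKI as the base64-encoded, plain (non-DER) value of the
    // SubjectKeyIdentifier extension; X509FindType.FindBySubjectKeyIdentifier wants hex.
    byte[] skiBytes = Convert.FromBase64String(skiNodes[0].InnerText.Trim());
    string skiHex = BitConverter.ToString(skiBytes).Replace("-", string.Empty);

    var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
    try
    {
        store.Open(OpenFlags.ReadOnly);
        X509Certificate2Collection matches = store.Certificates.Find(
            X509FindType.FindBySubjectKeyIdentifier, skiHex, validOnly: false);
        return matches.Count > 0 ? matches[0] : null;
    }
    finally
    {
        store.Close();
    }
}
The certificate located this way can then be passed to signedXml.CheckSignature(cert.PublicKey.Key) exactly like the certificates in the rsa-sha256 branch above; the trust comes from the certificate being preinstalled and trusted, not from the SKI value in the message.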

Related

How to convert CertificateRequest in base64string format to CertificateRequest object in .NET Core [duplicate]

I want to load a Certificate Request (CSR) from its serialized form and sign it.
Is this possible in pure .NET?
The CSR looks like this:
-----BEGIN CERTIFICATE REQUEST-----
MIIDejCCAmICAQAwZTE0MDIGCgmSJom....
-----END CERTIFICATE REQUEST-----
It was generated using .NET 4.7.2 CertificateRequest, similar to the answer in this question:
Generate and Sign Certificate Request using pure .net Framework
The serialized CSR is then sent to a server, which needs to create the certificate - the question is how to do that.
Do You Really Want To Do This?
Parsing a Certification Request (colloquially known as a Certificate Signing Request or CSR) and signing it blindly is a very, very bad operational practice.
If you want to be a Certificate Authority, even a private one, you should read and understand everything in the CA/Browser Forum's current (as of whenever you read this) Baseline Requirements document at https://cabforum.org/baseline-requirements-documents/. Maybe you intentionally decide something doesn't apply to you, but then at least it's intentional.
At a minimum you should be checking that the request (a rough sketch of such checks follows this list):
Doesn't grant itself CA authority (hint, issue your signing certificate with a pathLenConstraint of 0 to help block this), unless of course you intend to create a subordinate CA (but, probably not).
Uses only approved key usage and extended key usage values.
Uses only approved subject name and Subject Alternative Names extension values (if the request has no EKU extension, or contains the TLS Server usage).
Doesn't define extensions that interfere with the operation of your CA (Authority Key Identifier, Authority Information Access, Issuer Alternative Name, CRL Distribution Points, ...)
Doesn't define any extensions you don't understand or haven't authorized for the request (e.g. the Certificate Transparency "poison" extension).
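As a rough, deliberately incomplete sketch of what such checks could look like against the CertificateRequest produced by the decoder below (the method name, the approved EKU list, and the fail-closed choices are my assumptions, not policy advice):
private static void EnforceMinimalPolicy(CertificateRequest request)
{
    foreach (X509Extension ext in request.CertificateExtensions)
    {
        switch (ext.Oid.Value)
        {
            case "2.5.29.19": // Basic Constraints: refuse anything claiming CA authority.
                var bc = ext as X509BasicConstraintsExtension
                    ?? new X509BasicConstraintsExtension(new AsnEncodedData(ext.Oid, ext.RawData), ext.Critical);
                if (bc.CertificateAuthority)
                    throw new InvalidOperationException("Request asks for a CA certificate.");
                break;
            case "2.5.29.35":          // Authority Key Identifier
            case "1.3.6.1.5.5.7.1.1":  // Authority Information Access
            case "2.5.29.18":          // Issuer Alternative Name
            case "2.5.29.31":          // CRL Distribution Points
                throw new InvalidOperationException(
                    $"Request contains a CA-controlled extension ({ext.Oid.Value}).");
            case "2.5.29.37": // Extended Key Usage: allow only what you have approved.
                var eku = ext as X509EnhancedKeyUsageExtension
                    ?? new X509EnhancedKeyUsageExtension(new AsnEncodedData(ext.Oid, ext.RawData), ext.Critical);
                foreach (Oid usage in eku.EnhancedKeyUsages)
                {
                    if (usage.Value != "1.3.6.1.5.5.7.3.1" && usage.Value != "1.3.6.1.5.5.7.3.2")
                        throw new InvalidOperationException($"EKU '{usage.Value}' is not approved.");
                }
                break;
        }
    }
    // Subject / Subject Alternative Name checks still have to happen here,
    // and they need an ASN.1 reader (see the SAN caveat below).
}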
Are You Sure You Really Want To Do This?
.NET doesn't have built-in support for reading Subject Alternative Names, which you're supposed to verify. (Don't use string parsing, use something like System.Formats.Asn1.AsnReader)
You probably also want to add an Authority Information Access extension, an Authority Key Identifier extension, and a CRL Distribution Points extension to the certificates you issue; there's no built-in support for that either.
https://github.com/dotnet/runtime/blob/8b8c390755189d45efc0c407992cb7c006b802b5/src/libraries/Common/tests/System/Security/Cryptography/X509Certificates/CertificateAuthority.cs does have examples of all of this (for the tests of X509Chain).
.NET doesn't have built-in support for writing CRLs or reading OCSP requests or producing OCSP responses, so you're on your own for revocation.
https://github.com/dotnet/runtime/blob/8b8c390755189d45efc0c407992cb7c006b802b5/src/libraries/Common/tests/System/Security/Cryptography/X509Certificates/RevocationResponder.cs (again, from the tests for X509Chain).
There are a whole lot of operational processes you need to deal with (see the CA/Browser Forum Baseline Requirements).
If you insist...
This code uses the new System.Formats.Asn1 package (specifically, it was tested with version 5.0.0-preview.8.20407.11 [which should be stable version 5.0.0 in November 2020] on .NET Framework 4.8 from an executable built targeting .NET Framework 4.7.2).
It does verify that the proof-of-private-key-possession signature is valid, and in doing so limits itself to RSA-SSA-PKCS1_v1.5 signatures (no ECDSA, no RSA-SSA-PSS). Adding other algorithms is (of course) possible.
This code DOES NOT provide any sort of operational policy. It's up to the caller to verify that only appropriate extensions are used (including that "critical" bits are appropriate), that names are all appropriate, and, well, anything else aside from "it can be decoded and the subject public key verifies the request signature".
There's an API oddity in that you need to tell the decode routine what hash algorithm you eventually intend to use when signing the request, because CertificateRequest requires it in the constructor to make subsequent signing calls easier.
OK, I think that's enough disclaimer, along with some more disclaimers in the code. So, here's enough code to be a "terrible" CA.
internal static class CertificationRequestDecoder
{
private const string BadPemRequest = "Input is not a PEM-encoded Certification Request.";
/// <summary>
/// Load a CertificateRequest from a PEM-encoded Certification Request
/// (a.k.a. Certificate Signing Request, CSR)
/// </summary>
/// <param name="pem">The PEM-encoded Certification Request</param>
/// <param name="signatureHashAlgorithm">
/// The hash algorithm to be used with the CA signature.
/// </param>
/// <returns>
/// A certificate request object containing the same data as the signing request.
/// </returns>
/// <exception cref="ArgumentNullException"><paramref name="pem"/> is <c>null</c>.</exception>
/// <exception cref="ArgumentException">
/// <paramref name="pem"/> is not a well-formed PEM encoding for a Certification Request.
/// </exception>
/// <exception cref="AsnContentException">
/// <paramref name="pem"/> does not contain a well-formed Certification Request.
/// </exception>
/// <exception cref="InvalidOperationException">
/// The request contains unsupported elements.
/// </exception>
/// <exception cref="CryptographicException">
/// The Certification Request signature is invalid.
/// </exception>
/// <seealso cref="DecodeDer(ReadOnlyMemory{byte},HashAlgorithmName)"/>
internal static CertificateRequest DecodePem(
string pem,
HashAlgorithmName signatureHashAlgorithm)
{
if (pem == null)
throw new ArgumentNullException(nameof(pem));
// This PEM reader is overly lax. It should check for a newline at the end of preEB
// and another at the beginning of postEB, but it skips it for Unix/Windows newline
// reasons.
//
// After all, this is just a sample, right?
const string PreEB = "-----BEGIN CERTIFICATE REQUEST-----";
const string PostEB = "-----END CERTIFICATE REQUEST-----";
int startIdx = pem.IndexOf(PreEB, StringComparison.Ordinal);
int endIdx = pem.IndexOf(PostEB, StringComparison.Ordinal);
if (startIdx < 0 || endIdx < 0)
throw new ArgumentException(BadPemRequest, nameof(pem));
if (startIdx != 0 && !string.IsNullOrWhiteSpace(pem.Substring(0, startIdx)))
throw new ArgumentException(BadPemRequest, nameof(pem));
if (endIdx < startIdx || !string.IsNullOrWhiteSpace(pem.Substring(endIdx + PostEB.Length)))
throw new ArgumentException(BadPemRequest, nameof(pem));
byte[] der;
try
{
int base64Start = startIdx + PreEB.Length;
string base64 = pem.Substring(base64Start, endIdx - base64Start);
der = Convert.FromBase64String(base64);
}
catch (FormatException e)
{
throw new ArgumentException(BadPemRequest, nameof(pem), e);
}
return DecodeDer(der, signatureHashAlgorithm);
}
internal static CertificateRequest DecodeDer(
byte[] der,
HashAlgorithmName signatureHashAlgorithm)
{
if (der == null)
throw new ArgumentNullException(nameof(der));
return DecodeDer(der.AsMemory(), signatureHashAlgorithm);
}
/// <summary>
/// Load a CertificateRequest from a DER-encoded Certification Request
/// (a.k.a. Certificate Signing Request, CSR)
/// </summary>
/// <param name="der">The DER-encoded Certification Request.</param>
/// <param name="signatureHashAlgorithm">
/// The hash algorithm to be used with the CA signature.
/// </param>
/// <returns>
/// A certificate request object containing the same data as the signing request.
/// </returns>
/// <exception cref="FormatException">
/// <paramref name="der"/> is not well-formed.
/// </exception>
/// <exception cref="InvalidOperationException">
/// The request contains unsupported elements.
/// </exception>
/// <exception cref="CryptographicException">
/// The Certification Request signature is invalid.
/// </exception>
/// <remarks>
/// This routine does not perform any sort of operational policy.
/// The caller is responsible for verifying that only valid extensions
/// are used, that the subject name is appropriate, and any other operational
/// concerns.
/// </remarks>
internal static CertificateRequest DecodeDer(
ReadOnlyMemory<byte> der,
HashAlgorithmName signatureHashAlgorithm)
{
AsnReader reader = new AsnReader(der, AsnEncodingRules.DER);
AsnReader certificationRequest = reader.ReadSequence();
reader.ThrowIfNotEmpty();
byte[] encodedRequestInfo = certificationRequest.PeekEncodedValue().ToArray();
AsnReader certificationRequestInfo = certificationRequest.ReadSequence();
AsnReader algorithm = certificationRequest.ReadSequence();
byte[] signature = certificationRequest.ReadBitString(out int unused);
if (unused != 0)
{
throw new InvalidOperationException("The signature was not complete bytes.");
}
certificationRequest.ThrowIfNotEmpty();
string algorithmOid = algorithm.ReadObjectIdentifier();
HashAlgorithmName hashAlg;
RSASignaturePadding signaturePadding = RSASignaturePadding.Pkcs1;
// This only supports RSA.
// Other algorithms could be added.
switch (algorithmOid)
{
case "1.2.840.113549.1.1.5":
hashAlg = HashAlgorithmName.SHA1;
break;
case "1.2.840.113549.1.1.11":
hashAlg = HashAlgorithmName.SHA256;
break;
case "1.2.840.113549.1.1.12":
hashAlg = HashAlgorithmName.SHA384;
break;
case "1.2.840.113549.1.1.13":
hashAlg = HashAlgorithmName.SHA512;
break;
default:
throw new InvalidOperationException(
$"No support for signature algorithm '{algorithmOid}'");
}
// Since only RSA-SSA-PKCS1 made it here, we know the parameters are missing, or NULL.
if (algorithm.HasData)
{
algorithm.ReadNull();
}
algorithm.ThrowIfNotEmpty();
CertificateRequest certReq =
DecodeCertificationRequestInfo(certificationRequestInfo, signatureHashAlgorithm);
RSA pubKey = GetRSA(certReq.PublicKey);
if (pubKey == null)
{
throw new InvalidOperationException("Requested public key was not an RSA key.");
}
if (!pubKey.VerifyData(encodedRequestInfo, signature, hashAlg, signaturePadding))
{
throw new CryptographicException();
}
return certReq;
}
private static CertificateRequest DecodeCertificationRequestInfo(
AsnReader certReqInfo,
HashAlgorithmName signatureHashAlgorithm)
{
//https://tools.ietf.org/html/rfc2986#section-4.1
// CertificationRequestInfo::= SEQUENCE {
// version INTEGER { v1(0) } (v1, ...),
// subject Name,
// subjectPKInfo SubjectPublicKeyInfo{ { PKInfoAlgorithms } },
// attributes[0] Attributes{ { CRIAttributes } }
// }
// As of Sept 2020, there's not a V2 request format.
if (!certReqInfo.TryReadInt32(out int version) || version != 0)
{
throw new InvalidOperationException("Only V1 requests are supported.");
}
byte[] encodedSubject = certReqInfo.ReadEncodedValue().ToArray();
X500DistinguishedName subject = new X500DistinguishedName(encodedSubject);
AsnReader spki = certReqInfo.ReadSequence();
AsnReader reqAttrs = certReqInfo.ReadSetOf(new Asn1Tag(TagClass.ContextSpecific, 0));
certReqInfo.ThrowIfNotEmpty();
// https://tools.ietf.org/html/rfc3280#section-4.1
// SubjectPublicKeyInfo::= SEQUENCE {
// algorithm AlgorithmIdentifier,
// subjectPublicKey BIT STRING
// }
AsnReader pubKeyAlg = spki.ReadSequence();
string algOid = pubKeyAlg.ReadObjectIdentifier();
byte[] algParams;
if (pubKeyAlg.HasData)
{
algParams = pubKeyAlg.ReadEncodedValue().ToArray();
pubKeyAlg.ThrowIfNotEmpty();
}
else
{
algParams = new byte[] { 0x05, 0x00 };
}
byte[] keyBytes = spki.ReadBitString(out int unusedBitCount);
if (unusedBitCount != 0)
{
throw new InvalidOperationException(
"The subjectPublicKey field was not made of full bytes.");
}
PublicKey publicKey = new PublicKey(
new Oid(algOid, null),
new AsnEncodedData(algParams),
new AsnEncodedData(keyBytes));
CertificateRequest request = new CertificateRequest(
subject,
publicKey,
signatureHashAlgorithm);
if (reqAttrs.HasData)
{
// This decode routine only supports one extension: the PKCS#9 extensionRequest
// https://tools.ietf.org/html/rfc2985
// extensionRequest ATTRIBUTE ::= {
// WITH SYNTAX ExtensionRequest
// SINGLE VALUE TRUE
// ID pkcs-9-at-extensionRequest
// }
//
// ExtensionRequest::= Extensions
// https://www.itu.int/ITU-T/formal-language/itu-t/x/x501/2012/InformationFramework.html
// Attribute{ATTRIBUTE: SupportedAttributes} ::= SEQUENCE {
// type ATTRIBUTE.&id({SupportedAttributes}),
// values SET SIZE(0..MAX) OF ATTRIBUTE.&Type({SupportedAttributes}{#type}),
// valuesWithContext SIZE(1..MAX) OF
// SEQUENCE {
// value ATTRIBUTE.&Type({SupportedAttributes}{#type}),
// contextList SET SIZE(1..MAX) OF Context,
// ...
// } OPTIONAL,
// ...
// }
// https://tools.ietf.org/html/rfc5280#section-4.1
// Extensions::= SEQUENCE SIZE(1..MAX) OF Extension
//
// Extension::= SEQUENCE {
// extnID OBJECT IDENTIFIER,
// critical BOOLEAN DEFAULT FALSE,
// extnValue OCTET STRING
// --contains the DER encoding of an ASN.1 value
// --corresponding to the extension type identified
// --by extnID
// }
AsnReader attribute = reqAttrs.ReadSequence();
string attrType = attribute.ReadObjectIdentifier();
AsnReader attrValues = attribute.ReadSetOf();
if (attrType != "1.2.840.113549.1.9.14")
{
throw new InvalidOperationException(
$"Certification Request attribute '{attrType}' is not supported.");
}
// No contexts are defined for the extensionRequest attribute,
// so valuesWithContext can't exist.
attribute.ThrowIfNotEmpty();
// The attribute is single-value, so it must be present
// and there mustn't be a second one.
AsnReader extensions = attrValues.ReadSequence();
attrValues.ThrowIfNotEmpty();
while (extensions.HasData)
{
AsnReader extension = extensions.ReadSequence();
string extnId = extension.ReadObjectIdentifier();
bool critical = false;
byte[] extnValue;
if (extension.PeekTag().HasSameClassAndValue(Asn1Tag.Boolean))
{
critical = extension.ReadBoolean();
}
extnValue = extension.ReadOctetString();
extension.ThrowIfNotEmpty();
X509Extension ext = new X509Extension(
extnId,
extnValue,
critical);
if (CryptoConfig.CreateFromName(extnId) is X509Extension typedExtn)
{
typedExtn.CopyFrom(ext);
ext = typedExtn;
}
request.CertificateExtensions.Add(ext);
}
}
return request;
}
private static RSA GetRSA(PublicKey certReqPublicKey)
{
try
{
return certReqPublicKey.Key as RSA;
}
catch (CryptographicException)
{
}
catch (PlatformNotSupportedException)
{
}
// The try will fail on .NET Framework with any RSA key whose public exponent
// is bigger than uint.MaxValue, because RSACryptoServiceProvider (Windows CAPI)
// doesn't support them.
if (certReqPublicKey.Oid.Value != "1.2.840.113549.1.1.1")
{
throw new InvalidOperationException(
$"The public key algorithm '{certReqPublicKey.Oid.Value}' is not supported.");
}
byte[] encodedParams = certReqPublicKey.EncodedParameters.RawData;
if (encodedParams != null && encodedParams.Length != 0)
{
if (encodedParams.Length != 2 ||
encodedParams[0] != 0x05 ||
encodedParams[1] != 0x00)
{
throw new InvalidOperationException(
"Invalid algorithm parameters for an RSA key.");
}
}
AsnReader encodedKey = new AsnReader(
certReqPublicKey.EncodedKeyValue.RawData,
AsnEncodingRules.DER);
// https://tools.ietf.org/html/rfc3447#appendix-A.1.1
// RSAPublicKey::= SEQUENCE {
// modulus INTEGER, --n
// publicExponent INTEGER --e
// }
AsnReader rsaPublicKey = encodedKey.ReadSequence();
BigInteger modulus = rsaPublicKey.ReadInteger();
BigInteger publicExponent = rsaPublicKey.ReadInteger();
rsaPublicKey.ThrowIfNotEmpty();
byte[] n = modulus.ToByteArray();
byte[] e = publicExponent.ToByteArray();
if (n[n.Length - 1] == 0)
{
Array.Resize(ref n, n.Length - 1);
}
if (e[e.Length - 1] == 0)
{
Array.Resize(ref e, e.Length - 1);
}
Array.Reverse(n);
Array.Reverse(e);
RSAParameters rsaParameters = new RSAParameters
{
Modulus = n,
Exponent = e,
};
RSACng rsaCng = new RSACng();
rsaCng.ImportParameters(rsaParameters);
return rsaCng;
}
}
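For completeness, here is a rough sketch of what the issuing side might look like once the request has been decoded and your policy checks have passed. csrPem, caCertificate (an X509Certificate2 with an RSA private key), the validity period, and the serial-number scheme are placeholders, and no extensions (AKI, AIA, CRL DP, ...) are added here:
CertificateRequest request = CertificationRequestDecoder.DecodePem(csrPem, HashAlgorithmName.SHA256);

// ... run your operational policy over request.SubjectName and
// request.CertificateExtensions before going any further ...

byte[] serialNumber = new byte[12];
using (RandomNumberGenerator rng = RandomNumberGenerator.Create())
{
    rng.GetBytes(serialNumber);
}
serialNumber[0] &= 0x7F; // keep the serial number positive

DateTimeOffset notBefore = DateTimeOffset.UtcNow;
DateTimeOffset notAfter = notBefore.AddDays(90);

using (RSA caKey = caCertificate.GetRSAPrivateKey())
{
    // The X509SignatureGenerator overload is used because this CertificateRequest was
    // built from a public key only, so it carries no RSA padding information itself.
    X509SignatureGenerator generator =
        X509SignatureGenerator.CreateForRSA(caKey, RSASignaturePadding.Pkcs1);

    using (X509Certificate2 issued = request.Create(
        caCertificate.SubjectName, generator, notBefore, notAfter, serialNumber))
    {
        // The issued certificate contains no private key; hand its DER (or a PEM
        // wrapping of it) back to the requester.
        File.WriteAllBytes("issued.cer", issued.Export(X509ContentType.Cert));
    }
}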
Update (2022-11-29)
.NET 7 has added the ability to load CSRs, via CertificateRequest.LoadSigningRequest and a PEM-input variant (LoadSigningRequestPem).
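A minimal sketch of that path, assuming .NET 7 or later (check the current documentation for the exact overload list):
// LoadSigningRequestPem verifies the CSR's self-signature by default; requested
// extensions are only copied into CertificateExtensions when you opt in explicitly.
CertificateRequest request = CertificateRequest.LoadSigningRequestPem(
    csrPem,
    HashAlgorithmName.SHA256,
    CertificateRequestLoadOptions.UnsafeLoadCertificateExtensions);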

The input does not contain any JSON tokens. Expected the input to start with a valid JSON token on LIVE Web API but works fine locally

I have this very odd issue I detail here:
https://github.com/NextGenSoftwareUK/Our-World-OASIS-API-HoloNET-HoloUnity-And-.NET-HDK/issues/14
Which I have copied below:
URI: http://oasisplatform.world/api/avatar/authenticate
Post Body Data:
{
"email": "davidellams#hotmail.com",
"password": "my-super-secret-password"
}
Use Postman or something similar to send the above and you will get back this:
{"type":"https://tools.ietf.org/html/rfc7231#section-6.5.1","title":"One or more validation errors occurred.","status":400,"traceId":"|a8a2c5a-4e9cfc5e41a08a61.","errors":{"$":["The input does not contain any JSON tokens. Expected the input to start with a valid JSON token, when isFinalBlock is true. Path: $ | LineNumber: 0 | BytePositionInLine: 0."],"setGlobally":["The value 'authenticate' is not valid."],"providerType":["The value 'avatar' is not valid."]}}
setGlobally and providerType are optional params for the authenticate method.
/// <summary>
/// Authenticate and log in using the given avatar credentials. Pass in the provider you wish to use. Set the setglobally flag to false for this provider to be used only for this request or true for it to be used for all future requests too.
/// </summary>
/// <param name="model"></param>
/// <param name="providerType"></param>
/// <param name="setGlobally"></param>
/// <returns></returns>
[HttpPost("authenticate/{providerType}/{setGlobally}")]
public ActionResult<AuthenticateResponse> Authenticate(AuthenticateRequest model, ProviderType providerType = ProviderType.Default, bool setGlobally = false)
{
GetAndActivateProvider(providerType, setGlobally);
return Authenticate(model);
}
/// <summary>
/// Authenticate and log in using the given avatar credentials.
/// </summary>
/// <param name="model"></param>
/// <returns></returns>
[HttpPost("authenticate")]
public ActionResult<AuthenticateResponse> Authenticate(AuthenticateRequest model)
{
AuthenticateResponse response = _avatarService.Authenticate(model, ipAddress());
if (!response.IsError && response.Avatar != null)
setTokenCookie(response.Avatar.RefreshToken);
return Ok(response);
}
Full code here:
https://github.com/NextGenSoftwareUK/Our-World-OASIS-API-HoloNET-HoloUnity-And-.NET-HDK/blob/master/NextGenSoftware.OASIS.API.ONODE.WebAPI/Controllers/AvatarController.cs
NOTE: IT LOOKS LIKE THIS IS ALSO AN ISSUE FOR ALL METHODS ON THE WEB API, SO SOMETHING IS DEFINITELY GOING ON WITH THE DEPLOYMENT. It works fine locally, so I can't work out what is going on.
This may also be related to another issue that started occurring out of the blue after no code had been changed:
https://github.com/NextGenSoftwareUK/Our-World-OASIS-API-HoloNET-HoloUnity-And-.NET-HDK/issues/15
Any help would be really appreciated thanks. :)

"The read session is not available for the input session token." exception

I'm having a problem on Azure DocumentDB with a single-partition collection.
Whenever I try to programmatically insert or query any document, I get an exception with the message:
"The read session is not available for the input session token."
As this collection was newly created, I thought this was a generic error, so I tried to recreate the collection in another database, but then when trying to create the collection I can't submit the deployment because I am asked for the partition key.
(screenshot of the portal validation error)
According to the documentation,
"You do not have to specify a partition key for these collections."
Can someone help? Am I doing something wrong?
The region is West Europe (in case it helps)
For the error you're getting about the input session token, can you add your code here?
For the issue in the portal where you're trying to create a collection, do the following:
In the partition key box, enter a space and then press delete; you should get a green check mark in the box.
This will be fixed in the portal shortly.
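If the portal keeps blocking you in the meantime, a single-partition collection can also be created from code with the same SDK used in the answer below; a rough sketch (the database and collection ids and the throughput value are placeholders):
// No PartitionKey definition on the DocumentCollection => single-partition collection.
var collection = new DocumentCollection { Id = "MyCollection" };
await client.CreateDocumentCollectionAsync(
    UriFactory.CreateDatabaseUri("MyDatabase"),
    collection,
    new RequestOptions { OfferThroughput = 400 });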
I assume from your code that you are trying to create generic pagination logic. In my experience with DocumentDB, pagination needs to be achieved by using the continuation token.
I generally have an extension that obtains said token and then I use it on subsequent requests like so:
/// <summary>
/// Paged results with continuation token
/// </summary>
/// <typeparam name="T"></typeparam>
public class PagedResults<T>
{
public PagedResults()
{
Results = new List<T>();
}
/// <summary>
/// Continuation Token for DocumentDB
/// </summary>
public string ContinuationToken { get; set; }
/// <summary>
/// Results
/// </summary>
public List<T> Results { get; set; }
}
/// <summary>
/// Creates a pagination wrapper with Continuation Token support
/// </summary>
/// <typeparam name="T"></typeparam>
/// <param name="source"></param>
/// <returns></returns>
public static async Task<PagedResults<T>> ToPagedResults<T>(this IQueryable<T> source)
{
var documentQuery = source.AsDocumentQuery();
var results = new PagedResults<T>();
try
{
var queryResult = await documentQuery.ExecuteNextAsync<T>();
if (!queryResult.Any())
{
return results;
}
results.ContinuationToken = queryResult.ResponseContinuation;
results.Results.AddRange(queryResult);
}
catch
{
//documentQuery.ExecuteNextAsync might throw an Exception if there are no results
return results;
}
return results;
}
You can use this helper in your code along with the FeedOptions:
var feedOptions = new FeedOptions() { MaxItemCount = sizeOfPage };
var collectionUri = UriFactory.CreateDocumentCollectionUri(DatabaseId, CollectionId);
PagedResults<T> results = await client.CreateDocumentQuery<T>(collectionUri,feedOptions).Where(predicate).ToPagedResults();
//You can check the ContinuationToken and use it on another query
if(!string.IsNullOrEmpty(results.ContinuationToken)){
feedOptions.RequestContinuation = results.ContinuationToken;
PagedResults<T> moreResults = await client.CreateDocumentQuery<T>( collectionUri,feedOptions ).Where(predicate).ToPagedResults();
}
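If you need to drain every page rather than hand the token back to the caller, the same helper can be looped until the token comes back empty; a sketch reusing the client, collectionUri, predicate and sizeOfPage from the snippet above:
var allResults = new List<T>();
string continuation = null;
do
{
    var options = new FeedOptions { MaxItemCount = sizeOfPage, RequestContinuation = continuation };
    PagedResults<T> page = await client.CreateDocumentQuery<T>(collectionUri, options)
        .Where(predicate)
        .ToPagedResults();
    allResults.AddRange(page.Results);
    continuation = page.ContinuationToken;
} while (!string.IsNullOrEmpty(continuation));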
Also, I maintain a repo on GitHub that contains helpers and providers for DocumentDB that you are free to use if you want; most are based on the Performance guidelines article and personal experience.
Another word of advice: try to update your SDK to the latest version, either the .NET Full Framework or the .NET Core version (depending on your project).

Create a plugin to decrypt custom field of Case Entity while Retrieve in MS Dynamics CRM Online

I have a requirement to encrypt a custom field and decrypt it automatically when viewing the case in the MS Dynamics CRM Online portal.
I created two plugins: one encrypts at PreCaseCreate and the other decrypts at PostCaseRetrieve. The plugin for encryption is working fine, but the plugin for decryption is not (meaning the encrypted content is not decrypted when viewed in the online portal).
Below is the code for decryption
// <copyright file="PostCaseRetrieve.cs" company="">
// Copyright (c) 2016 All Rights Reserved
// </copyright>
// <author></author>
// <date>4/20/2016 1:58:24 AM</date>
// <summary>Implements the PostCaseRetrieve Plugin.</summary>
// <auto-generated>
// This code was generated by a tool.
// Runtime Version:4.0.30319.1
// </auto-generated>
namespace CRMCaseEntityDecryptPlugin.Plugins
{
using System;
using System.ServiceModel;
using Microsoft.Xrm.Sdk;
using System.Text;
using System.Security.Cryptography;
using Microsoft.Xrm.Sdk.Query;
/// <summary>
/// PostCaseRetrieve Plugin.
/// </summary>
public class PostCaseRetrieve : Plugin
{
/// <summary>
/// Initializes a new instance of the <see cref="PostCaseRetrieve"/> class.
/// </summary>
public PostCaseRetrieve()
: base(typeof(PostCaseRetrieve))
{
base.RegisteredEvents.Add(new Tuple<int, string, string, Action<LocalPluginContext>>(40, "Retrieve", "incident", new Action<LocalPluginContext>(ExecutePostCaseRetrieve)));
// Note : you can register for more events here if this plugin is not specific to an individual entity and message combination.
// You may also need to update your RegisterFile.crmregister plug-in registration file to reflect any change.
}
/// <summary>
/// Executes the plug-in.
/// </summary>
/// <param name="localContext">The <see cref="LocalPluginContext"/> which contains the
/// <see cref="IPluginExecutionContext"/>,
/// <see cref="IOrganizationService"/>
/// and <see cref="ITracingService"/>
/// </param>
/// <remarks>
/// For improved performance, Microsoft Dynamics CRM caches plug-in instances.
/// The plug-in's Execute method should be written to be stateless as the constructor
/// is not called for every invocation of the plug-in. Also, multiple system threads
/// could execute the plug-in at the same time. All per invocation state information
/// is stored in the context. This means that you should not use global variables in plug-ins.
/// </remarks>
protected void ExecutePostCaseRetrieve(LocalPluginContext localContext)
{
if (localContext == null)
{
throw new ArgumentNullException("localContext");
}
// TODO: Implement your custom Plug-in business logic.
IPluginExecutionContext context = localContext.PluginExecutionContext;
IOrganizationService service = localContext.OrganizationService;
// The InputParameters collection contains all the data passed in the message request.
if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)
{
// Obtain the target entity from the input parmameters.
Entity entity = (Entity)context.InputParameters["Target"];
if (entity.LogicalName.ToLower().Equals("incident"))
{
try
{
ColumnSet cols = new ColumnSet(new String[] { "title", "description", "new_phicontent" });
var incident = service.Retrieve("incident", entity.Id, cols);
if (incident.Attributes.Contains("new_phicontent"))
{
string PHIContent = incident.Attributes["new_phicontent"].ToString();
byte[] bInput = Convert.FromBase64String(PHIContent);
UTF8Encoding UTF8 = new UTF8Encoding();
//Encrypt/Decrypt strings which in turn uses 3DES (Triple Data Encryption standard) algorithm
TripleDESCryptoServiceProvider tripledescryptoserviceprovider = new TripleDESCryptoServiceProvider();
//Allows computing a hash value for encryption/decryption
MD5CryptoServiceProvider md5cryptoserviceprovider = new MD5CryptoServiceProvider();
tripledescryptoserviceprovider.Key = md5cryptoserviceprovider.ComputeHash(ASCIIEncoding.ASCII.GetBytes("secretkey"));
tripledescryptoserviceprovider.Mode = CipherMode.ECB;
ICryptoTransform icryptotransform = tripledescryptoserviceprovider.CreateDecryptor();
string DecryptedText = UTF8.GetString(icryptotransform.TransformFinalBlock(bInput, 0, bInput.Length));
incident["new_phicontent"] = DecryptedText;
service.Update(incident);
}
}
catch (FaultException ex)
{
throw new InvalidPluginExecutionException("An error occurred in the plug-in.", ex);
}
}
}
}
}
}
I tried with the PreCaseRetrieve event as well, but I didn't get a result.
Kindly provide some solution to resolve this.
Thanks in advance.
Leave your plugin as a post plugin.
The Target object in InputParameters is the object that is sent to the client, so if you modify the target object, you modify what is sent to the client. So don't retrieve the incident and then update it. Instead, if entity contains the new_phicontent attribute, you know the client requested that attribute and it needs to be decrypted, so decrypt the value and set entity["new_phicontent"]. Here's the updated code:
// Obtain the target entity from the input parmameters.
Entity entity = (Entity)context.InputParameters["Target"];
if (entity.LogicalName.ToLower().Equals("incident"))
{
try
{
if (entity.Attributes.Contains("new_phicontent"))
{
string PHIContent = (string)entity.Attributes["new_phicontent"];
byte[] bInput = Convert.FromBase64String(PHIContent);
// removed for brevity
string decryptedText = UTF8.GetString(icryptotransform.TransformFinalBlock(bInput, 0, bInput.Length));
entity["new_phicontent"] = decryptedText;
}
}
catch (FaultException ex)
{
throw new InvalidPluginExecutionException("An error occurred in the plug-in.", ex);
}
}
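Putting that together, the body of ExecutePostCaseRetrieve could look roughly like this. This is only a sketch: it reuses the 3DES/MD5 code from the question unchanged (hard-coded key and ECB mode included) purely to show the in-place update of the Target entity instead of the service.Retrieve/service.Update round trip:
protected void ExecutePostCaseRetrieve(LocalPluginContext localContext)
{
    if (localContext == null)
        throw new ArgumentNullException("localContext");

    IPluginExecutionContext context = localContext.PluginExecutionContext;
    if (!context.InputParameters.Contains("Target") || !(context.InputParameters["Target"] is Entity))
        return;

    Entity entity = (Entity)context.InputParameters["Target"];
    if (!entity.LogicalName.ToLower().Equals("incident") || !entity.Attributes.Contains("new_phicontent"))
        return;

    try
    {
        string phiContent = (string)entity.Attributes["new_phicontent"];
        byte[] bInput = Convert.FromBase64String(phiContent);

        using (var md5 = new MD5CryptoServiceProvider())
        using (var tripleDes = new TripleDESCryptoServiceProvider())
        {
            // Same key derivation and mode as the encryption plugin in the question.
            tripleDes.Key = md5.ComputeHash(ASCIIEncoding.ASCII.GetBytes("secretkey"));
            tripleDes.Mode = CipherMode.ECB;

            using (ICryptoTransform decryptor = tripleDes.CreateDecryptor())
            {
                string decryptedText = Encoding.UTF8.GetString(
                    decryptor.TransformFinalBlock(bInput, 0, bInput.Length));

                // Modify the Target entity itself; it is what gets returned to the caller,
                // so no service.Update call is needed (or wanted) here.
                entity["new_phicontent"] = decryptedText;
            }
        }
    }
    catch (FaultException ex)
    {
        throw new InvalidPluginExecutionException("An error occurred in the plug-in.", ex);
    }
}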

Changing logged in user from a drop-down list for role testing

Authentication is set to Windows and roleManager is using AspNetWindowsTokenRoleProvider. This is an intranet application.
I've created a couple of different test accounts (local to the web server) to test the different roles of the application. Each user belongs to a different local group, and each group is a different role in the application. I would like to provide a selection option for the user to switch between the accounts.
Ideally I would like a drop-down box on the web page with the following types of options for the user to select:
WindowsUser (User currently logged into windows or challenge for username/password if this can't be figured out)
TestUserForRole1
TestUserForRole2
TestUserForRole3
If this is not possible, the next best thing would be to prompt the user to enter a username/password and authenticate them on the server to establish a new session with the provided credentials. What are my options? Thanks.
I think I figured this one out. My solution is to set a cookie with the selection from the drop-down box and refresh the page. Read the cookie value in the Application_AuthenticateRequest event and replace HttpContext.User with the requested user if needed. Here's the code:
Select the user like this:
<select id="LoginUser" name="LoginUser" onchange="ChangeLoginUser(this)">
<option value="NONE">-- Change User --</option>
<option value="">Windows User</option>
<option value="UserOne">User One</option>
<option value="UserTwo">User Two</option>
</select>
Handle the user selection with JavaScript (AppPath refers to the application root path):
function ChangeLoginUser(sel) {
var selectedUser = sel.options[sel.selectedIndex].value;
if (selectedUser == "NONE") return;
$.cookie("LoginUser", selectedUser, { path: AppPath });
location.reload(true); //refresh
}
Override currently logged in windows user with selection:
protected void Application_AuthenticateRequest(Object sender, EventArgs e)
{
OverrideLoginUser();
}
private void OverrideLoginUser()
{
HttpCookie authCookie = Context.Request.Cookies["LoginUser"];
if (authCookie == null || string.IsNullOrWhiteSpace(authCookie.Value))
return; //regular authentication
if (User != null && User.Username().Equals(authCookie.Value))
return; //already set
Context.User = GetTestUser(authCookie.Value);
}
private WindowsPrincipal GetTestUser(string user)
{
WindowsPrincipal testuser = null;
IntPtr hToken;
if (WinSec.LogonUser(user, "", "ThePassword", // all users have the same password
(int)LogonType.LOGON32_LOGON_INTERACTIVE, (int)LogonProvider.LOGON32_PROVIDER_DEFAULT, out hToken))
{
testuser = new WindowsPrincipal(new WindowsIdentity(hToken, "WindowsAuthentication"));
}
if (hToken != IntPtr.Zero) WinSec.CloseHandle(hToken);
if (testuser == null)
throw new Exception("Error getting test user");
return testuser;
}
The Username() extension:
public static string Username(this IPrincipal user)
{
var name = user.Identity.Name;
//remove domain name
name = Regex.Replace(name, ".*\\\\(.*)", "$1", RegexOptions.None);
return name;
}
The LogonUser function needs to be made available by the following class:
/// <summary>
/// http://pinvoke.net/default.aspx/advapi32.LogonUser
/// </summary>
public class WinSec
{
[DllImport("advapi32.dll", SetLastError = true)]
public static extern bool LogonUser(
string lpszUsername,
string lpszDomain,
string lpszPassword,
int dwLogonType,
int dwLogonProvider,
out IntPtr phToken
);
[DllImport("advapi32.dll", SetLastError = true)]
public extern static bool DuplicateToken(IntPtr ExistingTokenHandle, int SECURITY_IMPERSONATION_LEVEL, out IntPtr DuplicateTokenHandle);
[DllImport("kernel32.dll", SetLastError = true)]
[return: MarshalAs(UnmanagedType.Bool)]
public static extern bool CloseHandle(IntPtr hObject);
}
public enum LogonType
{
/// <summary>
/// This logon type is intended for users who will be interactively using the computer, such as a user being logged on
/// by a terminal server, remote shell, or similar process.
/// This logon type has the additional expense of caching logon information for disconnected operations;
/// therefore, it is inappropriate for some client/server applications,
/// such as a mail server.
/// </summary>
LOGON32_LOGON_INTERACTIVE = 2,
/// <summary>
/// This logon type is intended for high performance servers to authenticate plaintext passwords.
/// The LogonUser function does not cache credentials for this logon type.
/// </summary>
LOGON32_LOGON_NETWORK = 3,
/// <summary>
/// This logon type is intended for batch servers, where processes may be executing on behalf of a user without
/// their direct intervention. This type is also for higher performance servers that process many plaintext
/// authentication attempts at a time, such as mail or Web servers.
/// The LogonUser function does not cache credentials for this logon type.
/// </summary>
LOGON32_LOGON_BATCH = 4,
/// <summary>
/// Indicates a service-type logon. The account provided must have the service privilege enabled.
/// </summary>
LOGON32_LOGON_SERVICE = 5,
/// <summary>
/// This logon type is for GINA DLLs that log on users who will be interactively using the computer.
/// This logon type can generate a unique audit record that shows when the workstation was unlocked.
/// </summary>
LOGON32_LOGON_UNLOCK = 7,
/// <summary>
/// This logon type preserves the name and password in the authentication package, which allows the server to make
/// connections to other network servers while impersonating the client. A server can accept plaintext credentials
/// from a client, call LogonUser, verify that the user can access the system across the network, and still
/// communicate with other servers.
/// NOTE: Windows NT: This value is not supported.
/// </summary>
LOGON32_LOGON_NETWORK_CLEARTEXT = 8,
/// <summary>
/// This logon type allows the caller to clone its current token and specify new credentials for outbound connections.
/// The new logon session has the same local identifier but uses different credentials for other network connections.
/// NOTE: This logon type is supported only by the LOGON32_PROVIDER_WINNT50 logon provider.
/// NOTE: Windows NT: This value is not supported.
/// </summary>
LOGON32_LOGON_NEW_CREDENTIALS = 9,
}
public enum LogonProvider
{
/// <summary>
/// Use the standard logon provider for the system.
/// The default security provider is negotiate, unless you pass NULL for the domain name and the user name
/// is not in UPN format. In this case, the default provider is NTLM.
/// NOTE: Windows 2000/NT: The default security provider is NTLM.
/// </summary>
LOGON32_PROVIDER_DEFAULT = 0,
LOGON32_PROVIDER_WINNT35 = 1,
LOGON32_PROVIDER_WINNT40 = 2,
LOGON32_PROVIDER_WINNT50 = 3
}
public enum SecurityImpersonationLevel
{
/// <summary>
/// The server process cannot obtain identification information about the client,
/// and it cannot impersonate the client. It is defined with no value given, and thus,
/// by ANSI C rules, defaults to a value of zero.
/// </summary>
SecurityAnonymous = 0,
/// <summary>
/// The server process can obtain information about the client, such as security identifiers and privileges,
/// but it cannot impersonate the client. This is useful for servers that export their own objects,
/// for example, database products that export tables and views.
/// Using the retrieved client-security information, the server can make access-validation decisions without
/// being able to use other services that are using the client's security context.
/// </summary>
SecurityIdentification = 1,
/// <summary>
/// The server process can impersonate the client's security context on its local system.
/// The server cannot impersonate the client on remote systems.
/// </summary>
SecurityImpersonation = 2,
/// <summary>
/// The server process can impersonate the client's security context on remote systems.
/// NOTE: Windows NT: This impersonation level is not supported.
/// </summary>
SecurityDelegation = 3,
}
