JSON decoder : Object reference not set to an instance of an object - biztalk

I get the following error on a JSON receive pipeline when calling a REST GET API:
There was a failure executing the response(receive) pipeline:
"mycustomPiepeline,mycustomPieplelineAssembly, Version=1.0.0.0, Culture=neutral, PublicKeyToken=70f46ad2a5c6e8c0"
Source: "JSON decoder" Send Port: "My webHttp send port" URI: "URL"
Reason: Object reference not set to an instance of an object.
Calling the API from Insomnia also returns a null response.
I found a solution on Mark's blog that uses the BRE Pipeline Framework, but I'm not using BRE.
I'm thinking of creating a custom pipeline component to replace the null response with an empty body. Are there any better suggestions?
I'm using BTS 2016 CU4

There is a possible fix from Microsoft, FIX: WCF-WebHTTP Two-Way Send Response responds with an empty message and causes the JSON decoder to fail in BizTalk Server, but that was in CU2 for BizTalk 2016, so it looks like they didn't solve the issue 100%, unless you didn't populate the AddMessageBodyForEmptyMessage property.
Mark's solution has more to do with some elements in a JSON payload created by the JSON Encoder having a null value, which itself is a workaround for a bug in the JSON Encoder that changes an empty string into a null.
If you don't want to use the BRE Pipeline Component (which hopefully will be available for BizTalk 2016 very soon, as I know it is being worked on), then you can roll your own, e.g. as below, where InsertInEmpty is a parameter on the pipeline component that lets you set what message to return if you get an empty body.
#region IComponent members
/// <summary>
/// Implements IComponent.Execute method.
/// </summary>
/// <param name="pc">Pipeline context</param>
/// <param name="inmsg">Input message</param>
/// <returns>Original input message</returns>
/// <remarks>
/// IComponent.Execute method is used to initiate
/// the processing of the message in this pipeline component.
/// </remarks>
public Microsoft.BizTalk.Message.Interop.IBaseMessage Execute(Microsoft.BizTalk.Component.Interop.IPipelineContext pc, Microsoft.BizTalk.Message.Interop.IBaseMessage inmsg)
{
    string dataOut;
    // Note: '&&' (not '&') so the length check short-circuits, and the stream
    // is only read when the body is non-empty.
    if (InsertInEmpty != "" && inmsg.BodyPart.Data.Length == 0)
    {
        dataOut = InsertInEmpty;
    }
    else
    {
        StreamReader sr = new StreamReader(inmsg.BodyPart.Data);
        dataOut = sr.ReadToEnd();
    }
    // UTF-8 rather than ASCII, so non-ASCII characters in the body are preserved.
    MemoryStream ms = new MemoryStream(System.Text.Encoding.UTF8.GetBytes(dataOut));
    ms.Position = 0;
    inmsg.BodyPart.Data = ms;
    return inmsg;
}
#endregion
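For reference, InsertInEmpty in the Execute method above would be a design-time property on the component. A minimal sketch of what that property might look like (the attribute wiring here is a typical pattern, not taken from the original component):

```csharp
// Hypothetical backing property for the InsertInEmpty parameter referenced in Execute.
// Exposing it as a public property lets it be configured per pipeline instance
// in the BizTalk Administration Console.
private string insertInEmpty = string.Empty;

[System.ComponentModel.Description("Message body to substitute when the inbound body is empty")]
public string InsertInEmpty
{
    get { return insertInEmpty; }
    set { insertInEmpty = value ?? string.Empty; }
}
```

In a real component the value would also need to be persisted via the IPersistPropertyBag Load/Save methods so that the configured value survives pipeline deployment.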


The input does not contain any JSON tokens. Expected the input to start with a valid JSON token on LIVE Web API but works fine locally

I have this very odd issue I detail here:
https://github.com/NextGenSoftwareUK/Our-World-OASIS-API-HoloNET-HoloUnity-And-.NET-HDK/issues/14
Which I have copied below:
URI: http://oasisplatform.world/api/avatar/authenticate
Post Body Data:
{
"email": "davidellams#hotmail.com",
"password": "my-super-secret-password"
}
Use Postman or something similar to send the above and you will get back this:
{"type":"https://tools.ietf.org/html/rfc7231#section-6.5.1","title":"One or more validation errors occurred.","status":400,"traceId":"|a8a2c5a-4e9cfc5e41a08a61.","errors":{"$":["The input does not contain any JSON tokens. Expected the input to start with a valid JSON token, when isFinalBlock is true. Path: $ | LineNumber: 0 | BytePositionInLine: 0."],"setGlobally":["The value 'authenticate' is not valid."],"providerType":["The value 'avatar' is not valid."]}}
setGlobally and providerType are optional params for the authenticate method.
/// <summary>
/// Authenticate and log in using the given avatar credentials. Pass in the provider you wish to use. Set the setglobally flag to false for this provider to be used only for this request or true for it to be used for all future requests too.
/// </summary>
/// <param name="model"></param>
/// <param name="providerType"></param>
/// <param name="setGlobally"></param>
/// <returns></returns>
[HttpPost("authenticate/{providerType}/{setGlobally}")]
public ActionResult<AuthenticateResponse> Authenticate(AuthenticateRequest model, ProviderType providerType = ProviderType.Default, bool setGlobally = false)
{
    GetAndActivateProvider(providerType, setGlobally);
    return Authenticate(model);
    AuthenticateResponse response = _avatarService.Authenticate(model, ipAddress());
    if (!response.IsError && response.Avatar != null)
        setTokenCookie(response.Avatar.RefreshToken);
    return Ok(response);
}
/// <summary>
/// Authenticate and log in using the given avatar credentials.
/// </summary>
/// <param name="model"></param>
/// <returns></returns>
[HttpPost("authenticate")]
public ActionResult<AuthenticateResponse> Authenticate(AuthenticateRequest model)
{
    AuthenticateResponse response = _avatarService.Authenticate(model, ipAddress());
    if (!response.IsError && response.Avatar != null)
        setTokenCookie(response.Avatar.RefreshToken);
    return Ok(response);
}
Full code here:
https://github.com/NextGenSoftwareUK/Our-World-OASIS-API-HoloNET-HoloUnity-And-.NET-HDK/blob/master/NextGenSoftware.OASIS.API.ONODE.WebAPI/Controllers/AvatarController.cs
NOTE: IT LOOKS LIKE THIS IS ALSO AN ISSUE FOR ALL METHODS ON THE WEB API, SO SOMETHING IS DEFINITELY GOING ON WITH THE DEPLOYMENT. It works fine locally, so I can't work out what is going on.
This may also be related to another issue that started occurring out of the blue after no code had been changed:
https://github.com/NextGenSoftwareUK/Our-World-OASIS-API-HoloNET-HoloUnity-And-.NET-HDK/issues/15
Any help would be really appreciated thanks. :)

"The read session is not available for the input session token." exception

I'm having a problem on Azure DocumentDB with a single-partition collection.
Whenever I try to programmatically insert or query any document, I get an exception with the message saying
"The read session is not available for the input session token."
As this collection was newly created, I thought this was a generic error, so I tried to recreate the collection on another database; but then, when trying to create the collection, I can't submit the deployment because I get asked for a partition key.
According to what the documentation says,
"You do not have to specify a partition key for these collections."
Can someone help? Am I doing something wrong?
The region is West Europe (in case it helps)
For the error you're getting about the input session token, can you add your code here?
For the issue in the portal where you're trying to create a collection, do the following:
In the partition key box, enter space and then press delete, you should get a green check mark in the box.
This will be fixed in the portal shortly.
I assume from your code that you are trying to create generic pagination logic. From my experience with DocDB, pagination needs to be achieved by using the Continuation Token.
I generally have an extension that obtains said token and then I use it on subsequent requests like so:
/// <summary>
/// Paged results with continuation token
/// </summary>
/// <typeparam name="T"></typeparam>
public class PagedResults<T>
{
    public PagedResults()
    {
        Results = new List<T>();
    }

    /// <summary>
    /// Continuation Token for DocumentDB
    /// </summary>
    public string ContinuationToken { get; set; }

    /// <summary>
    /// Results
    /// </summary>
    public List<T> Results { get; set; }
}
/// <summary>
/// Creates a pagination wrapper with Continuation Token support
/// </summary>
/// <typeparam name="T"></typeparam>
/// <param name="source"></param>
/// <returns></returns>
// NOTE: as an extension method, this must be declared inside a static class.
public static async Task<PagedResults<T>> ToPagedResults<T>(this IQueryable<T> source)
{
    var documentQuery = source.AsDocumentQuery();
    var results = new PagedResults<T>();
    try
    {
        var queryResult = await documentQuery.ExecuteNextAsync<T>();
        if (!queryResult.Any())
        {
            return results;
        }
        results.ContinuationToken = queryResult.ResponseContinuation;
        results.Results.AddRange(queryResult);
    }
    catch
    {
        // documentQuery.ExecuteNextAsync might throw an exception if there are no results
        return results;
    }
    return results;
}
You can use this helper in your code along with the FeedOptions:
var feedOptions = new FeedOptions() { MaxItemCount = sizeOfPage };
var collectionUri = UriFactory.CreateDocumentCollectionUri(DatabaseId, CollectionId);
PagedResults<T> results = await client.CreateDocumentQuery<T>(collectionUri, feedOptions).Where(predicate).ToPagedResults();

// You can check the ContinuationToken and use it on another query
if (!string.IsNullOrEmpty(results.ContinuationToken))
{
    feedOptions.RequestContinuation = results.ContinuationToken;
    PagedResults<T> moreResults = await client.CreateDocumentQuery<T>(collectionUri, feedOptions).Where(predicate).ToPagedResults();
}
Also, I maintain a repo on Github that contains helpers and providers for DocDB that you are free to use if you want, most are based on the Performance guidelines article and personal experience.
Another word of advice, try to update your SDK to the latest version, either the .Net Full framework or the .Net Core version (depending on your project).

Create a plugin to decrypt custom field of Case Entity while Retrieve in MS Dynamics CRM Online

I have a requirement to encrypt a custom field and decrypt it automatically when viewing the case in the MS Dynamics CRM Online portal.
I created two plugins: one to encrypt at PreCaseCreate and the other to decrypt at PostCaseRetrieve. The plugin for encryption is working fine, but the plugin for decryption is not working (i.e., the encrypted content is not decrypted when viewing in the online portal).
Below is the code for decryption
// <copyright file="PostCaseRetrieve.cs" company="">
// Copyright (c) 2016 All Rights Reserved
// </copyright>
// <author></author>
// <date>4/20/2016 1:58:24 AM</date>
// <summary>Implements the PostCaseRetrieve Plugin.</summary>
// <auto-generated>
// This code was generated by a tool.
// Runtime Version:4.0.30319.1
// </auto-generated>
namespace CRMCaseEntityDecryptPlugin.Plugins
{
    using System;
    using System.ServiceModel;
    using Microsoft.Xrm.Sdk;
    using System.Text;
    using System.Security.Cryptography;
    using Microsoft.Xrm.Sdk.Query;

    /// <summary>
    /// PostCaseRetrieve Plugin.
    /// </summary>
    public class PostCaseRetrieve : Plugin
    {
        /// <summary>
        /// Initializes a new instance of the <see cref="PostCaseRetrieve"/> class.
        /// </summary>
        public PostCaseRetrieve()
            : base(typeof(PostCaseRetrieve))
        {
            base.RegisteredEvents.Add(new Tuple<int, string, string, Action<LocalPluginContext>>(40, "Retrieve", "incident", new Action<LocalPluginContext>(ExecutePostCaseRetrieve)));
            // Note: you can register for more events here if this plugin is not specific to an individual entity and message combination.
            // You may also need to update your RegisterFile.crmregister plug-in registration file to reflect any change.
        }

        /// <summary>
        /// Executes the plug-in.
        /// </summary>
        /// <param name="localContext">The <see cref="LocalPluginContext"/> which contains the
        /// <see cref="IPluginExecutionContext"/>,
        /// <see cref="IOrganizationService"/>
        /// and <see cref="ITracingService"/>
        /// </param>
        /// <remarks>
        /// For improved performance, Microsoft Dynamics CRM caches plug-in instances.
        /// The plug-in's Execute method should be written to be stateless as the constructor
        /// is not called for every invocation of the plug-in. Also, multiple system threads
        /// could execute the plug-in at the same time. All per invocation state information
        /// is stored in the context. This means that you should not use global variables in plug-ins.
        /// </remarks>
        protected void ExecutePostCaseRetrieve(LocalPluginContext localContext)
        {
            if (localContext == null)
            {
                throw new ArgumentNullException("localContext");
            }

            IPluginExecutionContext context = localContext.PluginExecutionContext;
            IOrganizationService service = localContext.OrganizationService;

            // The InputParameters collection contains all the data passed in the message request.
            if (context.InputParameters.Contains("Target") && context.InputParameters["Target"] is Entity)
            {
                // Obtain the target entity from the input parameters.
                Entity entity = (Entity)context.InputParameters["Target"];
                if (entity.LogicalName.ToLower().Equals("incident"))
                {
                    try
                    {
                        ColumnSet cols = new ColumnSet(new String[] { "title", "description", "new_phicontent" });
                        var incident = service.Retrieve("incident", entity.Id, cols);
                        if (incident.Attributes.Contains("new_phicontent"))
                        {
                            string PHIContent = incident.Attributes["new_phicontent"].ToString();
                            byte[] bInput = Convert.FromBase64String(PHIContent);
                            UTF8Encoding UTF8 = new UTF8Encoding();
                            // Encrypt/decrypt strings using the 3DES (Triple Data Encryption Standard) algorithm
                            TripleDESCryptoServiceProvider tripledescryptoserviceprovider = new TripleDESCryptoServiceProvider();
                            // Allows computing a hash value for encryption/decryption
                            MD5CryptoServiceProvider md5cryptoserviceprovider = new MD5CryptoServiceProvider();
                            tripledescryptoserviceprovider.Key = md5cryptoserviceprovider.ComputeHash(ASCIIEncoding.ASCII.GetBytes("secretkey"));
                            tripledescryptoserviceprovider.Mode = CipherMode.ECB;
                            ICryptoTransform icryptotransform = tripledescryptoserviceprovider.CreateDecryptor();
                            string DecryptedText = UTF8.GetString(icryptotransform.TransformFinalBlock(bInput, 0, bInput.Length));
                            incident["new_phicontent"] = DecryptedText;
                            service.Update(incident);
                        }
                    }
                    catch (FaultException ex)
                    {
                        throw new InvalidPluginExecutionException("An error occurred in the plug-in.", ex);
                    }
                }
            }
        }
    }
}
I tried the PreCaseRetrieve event as well, but I didn't get a result.
Kindly provide a solution to resolve this.
Thanks in advance.
Leave your plugin as a post plugin.
The Target object from InputParameters is the object that is sent to the client, so if you modify the target object, you modify what is sent to the client. So don't retrieve the incident and then update it. Instead, if entity contains the new_phicontent attribute, you know the client requested the attribute and it needs to be decrypted; decrypt the value and then update entity["new_phicontent"]. Here's the updated code:
// Obtain the target entity from the input parameters.
Entity entity = (Entity)context.InputParameters["Target"];
if (entity.LogicalName.ToLower().Equals("incident"))
{
    try
    {
        if (entity.Attributes.Contains("new_phicontent"))
        {
            string PHIContent = (string)entity.Attributes["new_phicontent"];
            byte[] bInput = Convert.FromBase64String(PHIContent);
            // removed for brevity
            string decryptedText = UTF8.GetString(icryptotransform.TransformFinalBlock(bInput, 0, bInput.Length));
            entity["new_phicontent"] = decryptedText;
        }
    }
    catch (FaultException ex)
    {
        throw new InvalidPluginExecutionException("An error occurred in the plug-in.", ex);
    }
}

SAML with a SKI instead of certificate in X509Data node

I have already implemented an authentication mechanism based on the SAML protocol. The project uses the SAML2 library. Everything worked fine until a change occurred on the server. The server used to respond with a <ds:X509Certificate> node:
<ds:KeyInfo><ds:X509Data><ds:X509Certificate>Here was certificate</ds:X509Certificate></ds:X509Data></ds:KeyInfo>
But it has changed to:
<ds:X509SKI>Here is Subject Key Identifier</ds:X509SKI>
SAML2 library has CheckSignature method which can be applied on server response:
/// <summary>
/// Checks the signature.
/// </summary>
/// <returns>True if the signature is valid, else false.</returns>
public bool CheckSignature()
{
    return XmlSignatureUtils.CheckSignature(Document);
}
It points here:
/// <summary>
/// Verifies the signature of the XmlDocument instance using the key enclosed with the signature.
/// </summary>
/// <param name="doc">The doc.</param>
/// <returns><code>true</code> if the document's signature can be verified. <code>false</code> if the signature could
/// not be verified.</returns>
/// <exception cref="InvalidOperationException">if the XmlDocument instance does not contain a signed XML document.</exception>
public static bool CheckSignature(XmlDocument doc)
{
    CheckDocument(doc);
    var signedXml = RetrieveSignature(doc);
    if (signedXml.SignatureMethod.Contains("rsa-sha256"))
    {
        // SHA256 keys must be obtained from the message manually
        var trustedCertificates = GetCertificates(doc);
        foreach (var cert in trustedCertificates)
        {
            if (signedXml.CheckSignature(cert.PublicKey.Key))
            {
            return true;
            }
        }
        return false;
    }
    return signedXml.CheckSignature();
}
And finally GetCertificates method looks like that:
/// <summary>
/// Gets the certificates.
/// </summary>
/// <param name="doc">The document.</param>
/// <returns>List of <see cref="X509Certificate2"/>.</returns>
private static List<X509Certificate2> GetCertificates(XmlDocument doc)
{
    var certificates = new List<X509Certificate2>();
    var x509CertificateNodeList = doc.GetElementsByTagName("ds:X509Certificate");
    if (x509CertificateNodeList.Count == 0)
    {
        x509CertificateNodeList = doc.GetElementsByTagName("X509Certificate");
    }
    foreach (XmlNode xn in x509CertificateNodeList)
    {
        try
        {
            var xc = new X509Certificate2(Convert.FromBase64String(xn.InnerText));
            certificates.Add(xc);
        }
        catch
        {
            // Swallow the certificate parse error
        }
    }
    return certificates;
}
As you can see, the library checks only certificates, not subject key identifiers. I believe I could implement SKI comparison between the installed certificate and the provided element on my own, but I'm not sure whether that is a legitimate way to do it.
Here Thomas Pornin wrote:
The Subject Key Identifier does not play a role in validation, at
least not in the algorithm which makes up section 6 of RFC 5280. It is
meant to be an help for path building
His statement suggests I can't do validation by comparing the SKI from the server response with the installed certificate.
RFC 5280 suggests the same, but I don't have enough time to read it carefully, so I'm asking for your help.
Is comparison of subject key identifier of installed X509 certificate and those in SAML response right way to verify response?
No; as already mentioned, the SKI is used only to bind certificates in the chain (when key matching is used). It doesn't provide enough information about the certificate and its details.
However, if the client has the full certificate preinstalled, the client could use the SKI to locate the right certificate and then use that certificate for validation procedures.
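As a sketch of that idea, a hypothetical helper could translate the base64 value from <ds:X509SKI> into the hex form that the Windows certificate store expects and look the certificate up there (the store name, location, and error handling are assumptions, not part of the SAML2 library):

```csharp
using System;
using System.Security.Cryptography.X509Certificates;

public static class SkiCertificateLocator
{
    // Hypothetical helper: resolve a <ds:X509SKI> value to a pre-installed certificate.
    public static X509Certificate2 FindBySki(string base64Ski)
    {
        // The SKI inside <ds:X509SKI> is base64-encoded raw bytes;
        // X509Store.Find expects the identifier as a hex string.
        string skiHex = BitConverter.ToString(Convert.FromBase64String(base64Ski)).Replace("-", "");

        var store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
        store.Open(OpenFlags.ReadOnly);
        try
        {
            X509Certificate2Collection matches = store.Certificates.Find(
                X509FindType.FindBySubjectKeyIdentifier, skiHex, validOnly: true);
            return matches.Count > 0 ? matches[0] : null;
        }
        finally
        {
            store.Close();
        }
    }
}
```

The certificate located this way could then be fed to signedXml.CheckSignature(cert.PublicKey.Key), mirroring what CheckSignature already does for certificates embedded in the message.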

How do I encrypt URLs in ASP.NET MVC?

I need to Encrypt the URLs in my ASP.NET MVC application.
Do I need to write the code in Global page in Route Collection to Encrypt all the URLs?
It's a bad idea to encrypt a URL. Period.
You may wonder why I say that.
I worked on an application for a company that encrypted its URLs. This was a webforms application. From the URL alone, it was nearly impossible to tell what part of the code I was hitting to cause that issue. Because of the dynamic nature of calling the webform controls, you just had to know the path the software was going to go down. It was quite unnerving.
Add to that that there was no role-based authorization in the application; it was all based on the URL being encrypted. If you could decrypt the URL (and if it can be encrypted, it can be decrypted), then you could conceivably enter another encrypted URL and impersonate another user. I'm not saying it's simple, but it can happen.
Finally, how often do you use the internet and see encrypted URLs? When you do, do you die a little inside? I do. URLs are meant to convey public information. If you don't want it to do that, don't put it in your URL (or require Authorization for sensitive areas of your site).
The IDs you're using in the database should be IDs that are ok for the user to see. If you're using an SSN as a primary key, then you should change that schema for a web application.
Anything that can be encrypted can be decrypted, and therefore is vulnerable to attack.
If you want a user to only access certain URLs if they're authorized, then you should use the [Authorize] attributes available in ASP.NET MVC.
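For example, a minimal sketch of attribute-based authorization in ASP.NET MVC (the controller, action, and role names here are illustrative):

```csharp
using System.Web.Mvc;

public class ReportsController : Controller
{
    // Only authenticated users in the "Admin" role can reach this action;
    // unauthorized requests are redirected to the login page.
    [Authorize(Roles = "Admin")]
    public ActionResult Manage()
    {
        return View();
    }

    // [AllowAnonymous] opts a single action out of authorization checks.
    [AllowAnonymous]
    public ActionResult Index()
    {
        return View();
    }
}
```

With this in place, access control lives with the code rather than in an obfuscated URL, so there is nothing for an attacker to decrypt and replay.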
Encrypting an entire url, I agree, very bad idea. Encrypting url parameters? Not so much and is actually a valid and widely used technique.
If you really want to encrypt/decrypt url parameters (which isn't a bad idea at all), then check out Mads Kristensen's article "HttpModule for query string encryption".
You will need to modify context_BeginRequest in order to get it to work for MVC. Just remove the first part of the if statement that checks if the original url contains "aspx".
With that said, I have used this module in a couple of projects (have a converted VB version if needed) and for the most part, it works like a charm.
BUT, there are some instances where I have experienced some issues with jQuery/Ajax calls not working correctly. I am sure the module could be modified in order to compensate for those scenarios.
Based on the answers here, which did not work for me BTW, I found another solution based on my particular MVC implementation; slight changes are needed depending on whether you're using IIS 7 or IIS 6.
IIS 6
Firstly, you need to add the following to your web.config (the root one, not the one in the Views folder).
<system.web>
  <httpModules>
    <add name="URIHandler" type="URIHandler" />
  </httpModules>
</system.web>
IIS 7
Add this instead to your web.config (root, not the one in the Views folder).
<system.webServer>
  <validation validateIntegratedModeConfiguration="false" />
  <modules runAllManagedModulesForAllRequests="true">
    <remove name="URIHandler" />
    <add name="URIHandler" type="URIHandler" />
  </modules>
</system.webServer>
Or you could add both. It doesn't matter really.
Next, use this class. I called it, as you've probably noticed, URIHandler.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;
using System.IO;
using System.Text;
using System.Security.Cryptography;
using System.Diagnostics.CodeAnalysis;

public class URIHandler : IHttpModule
{
    #region IHttpModule members

    public void Dispose()
    {
    }

    public void Init(HttpApplication context)
    {
        context.BeginRequest += new EventHandler(context_BeginRequest);
    }

    #endregion

    private const string PARAMETER_NAME = "enc=";
    private const string ENCRYPTION_KEY = "key";

    private void context_BeginRequest(object sender, EventArgs e)
    {
        HttpContext context = HttpContext.Current;
        //if (context.Request.Url.OriginalString.Contains("aspx") && context.Request.RawUrl.Contains("?"))
        if (context.Request.RawUrl.Contains("?"))
        {
            string query = ExtractQuery(context.Request.RawUrl);
            string path = GetVirtualPath();
            if (query.StartsWith(PARAMETER_NAME, StringComparison.OrdinalIgnoreCase))
            {
                // Decrypts the query string and rewrites the path.
                string rawQuery = query.Replace(PARAMETER_NAME, string.Empty);
                string decryptedQuery = Decrypt(rawQuery);
                context.RewritePath(path, string.Empty, decryptedQuery);
            }
            else if (context.Request.HttpMethod == "GET")
            {
                // Encrypts the query string and redirects to the encrypted URL.
                // Remove if you don't want all query strings to be encrypted automatically.
                string encryptedQuery = Encrypt(query);
                context.Response.Redirect(path + encryptedQuery);
            }
        }
    }

    /// <summary>
    /// Parses the current URL and extracts the virtual path without query string.
    /// </summary>
    /// <returns>The virtual path of the current URL.</returns>
    private static string GetVirtualPath()
    {
        string path = HttpContext.Current.Request.RawUrl;
        path = path.Substring(0, path.IndexOf("?"));
        path = path.Substring(path.LastIndexOf("/") + 1);
        return path;
    }

    /// <summary>
    /// Parses a URL and returns the query string.
    /// </summary>
    /// <param name="url">The URL to parse.</param>
    /// <returns>The query string without the question mark.</returns>
    private static string ExtractQuery(string url)
    {
        int index = url.IndexOf("?") + 1;
        return url.Substring(index);
    }

    #region Encryption/decryption

    /// <summary>
    /// The salt value used to strengthen the encryption.
    /// </summary>
    private readonly static byte[] SALT = Encoding.ASCII.GetBytes(ENCRYPTION_KEY.Length.ToString());

    /// <summary>
    /// Encrypts any string using the Rijndael algorithm.
    /// </summary>
    /// <param name="inputText">The string to encrypt.</param>
    /// <returns>A Base64 encrypted string.</returns>
    [SuppressMessage("Microsoft.Usage", "CA2202:Do not dispose objects multiple times")]
    public static string Encrypt(string inputText)
    {
        RijndaelManaged rijndaelCipher = new RijndaelManaged();
        byte[] plainText = Encoding.Unicode.GetBytes(inputText);
        PasswordDeriveBytes secretKey = new PasswordDeriveBytes(ENCRYPTION_KEY, SALT);
        using (ICryptoTransform encryptor = rijndaelCipher.CreateEncryptor(secretKey.GetBytes(32), secretKey.GetBytes(16)))
        {
            using (MemoryStream memoryStream = new MemoryStream())
            {
                using (CryptoStream cryptoStream = new CryptoStream(memoryStream, encryptor, CryptoStreamMode.Write))
                {
                    cryptoStream.Write(plainText, 0, plainText.Length);
                    cryptoStream.FlushFinalBlock();
                    return "?" + PARAMETER_NAME + Convert.ToBase64String(memoryStream.ToArray());
                }
            }
        }
    }

    /// <summary>
    /// Decrypts a previously encrypted string.
    /// </summary>
    /// <param name="inputText">The encrypted string to decrypt.</param>
    /// <returns>A decrypted string.</returns>
    [SuppressMessage("Microsoft.Usage", "CA2202:Do not dispose objects multiple times")]
    public static string Decrypt(string inputText)
    {
        RijndaelManaged rijndaelCipher = new RijndaelManaged();
        byte[] encryptedData = Convert.FromBase64String(inputText);
        PasswordDeriveBytes secretKey = new PasswordDeriveBytes(ENCRYPTION_KEY, SALT);
        using (ICryptoTransform decryptor = rijndaelCipher.CreateDecryptor(secretKey.GetBytes(32), secretKey.GetBytes(16)))
        {
            using (MemoryStream memoryStream = new MemoryStream(encryptedData))
            {
                using (CryptoStream cryptoStream = new CryptoStream(memoryStream, decryptor, CryptoStreamMode.Read))
                {
                    byte[] plainText = new byte[encryptedData.Length];
                    int decryptedCount = cryptoStream.Read(plainText, 0, plainText.Length);
                    return Encoding.Unicode.GetString(plainText, 0, decryptedCount);
                }
            }
        }
    }

    #endregion
}
You don't need a namespace.
The above class does everything you need to encrypt and decrypt any URL parameters starting with the '?' character. It even does a nice job of renaming your parameter variables to 'enc', which is a bonus.
Lastly, place the class in your App_Start folder, NOT the App_Code folder, as the latter can cause ambiguous-reference errors.
Done.
Credits:
https://www.codeproject.com/questions/1036066/how-to-hide-url-parameter-asp-net-mvc
https://msdn.microsoft.com/en-us/library/aa719858(v=vs.71).aspx
HttpModule Init method were not called
C# Please specify the assembly explicitly in the type name
https://stackoverflow.com/questions/1391060/httpmodule-with-asp-net-mvc-not-being-called
You can create a custom HTML helper to encrypt the query string and a custom action filter attribute to decrypt it and get the original values back. You can implement it globally, so it won't take much of your time. You can take a reference from Url Encryption In Asp.Net MVC, which walks through both the custom helper and the custom action filter attribute.
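A rough sketch of the decryption side of that approach (the attribute name and the "enc" key are illustrative; Decrypt here is assumed to be shared with whatever helper encrypted the URL, e.g. the URIHandler.Decrypt method shown earlier):

```csharp
using System.Web.Mvc;

// Hypothetical action filter: decrypts an "enc" query-string value before the
// action executes and makes the plain query string available via RouteData.
public class DecryptQueryStringAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        string encrypted = filterContext.HttpContext.Request.QueryString["enc"];
        if (!string.IsNullOrEmpty(encrypted))
        {
            string plainQuery = URIHandler.Decrypt(encrypted);
            filterContext.RouteData.Values["decryptedQuery"] = plainQuery;
        }
        base.OnActionExecuting(filterContext);
    }
}
```

An action decorated with [DecryptQueryString] can then read RouteData.Values["decryptedQuery"] instead of parsing the raw query string itself.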
It's likely pointless to globally encrypt all of the URL parameters (the query string). Most parameters are display items used by HTTP GET; if everything is encrypted, the page won't be very informative. However, if there are sensitive parameters that are only hidden fields (keys) on the client and are eventually returned to the server to identify a record, those might be worth encrypting.
Consider this viewModel:
public class viewModel
{
    public int key { get; set; }          // Might want to encrypt
    public string FirstName { get; set; } // Don't want this encrypted
    public string LastName { get; set; }  // Don't want this encrypted
}
The viewModel gets converted into a query string, something close to:
appName.com/index?key=2&FirstName=John&LastName=Doe
If this viewModel is passed as a query string, what's the point in encrypting the first and last names?
It should be noted that query strings belong to HTTP GET. HTTP POST passes values in the request body rather than the query string, so posted values never appear in the URL, and over HTTPS the body is encrypted in transit. But there is overhead to using POST. So, if your page actually contains sensitive data (perhaps the user's current password), consider switching to HTTP POST instead.
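As a minimal sketch of that switch (the controller and action names are illustrative), with a POST the sensitive key travels in the request body rather than the URL:

```csharp
using System.Web.Mvc;

public class RecordsController : Controller
{
    // The key arrives in the form body of the POST request, so it never
    // appears in the address bar, browser history, or typical access logs.
    [HttpPost]
    public ActionResult Update(viewModel model)
    {
        // ... look up and update the record identified by model.key ...
        return RedirectToAction("Index");
    }
}
```

Model binding fills model.key from the posted form fields, so no query-string encryption is needed for it at all.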
