Sage Pay Encryption for Request Data for Form Integration

I am working on an application built in VB.NET 2003 (Framework 2.0). I have to integrate Sage Pay using the Form Integration method. Can anyone please provide code with which I can encrypt the request data? A sample of the request data is given below:
VendorTxCode=TxCode-1310917599-223087284&Amount=36.95&Currency=GBP&Description=description&CustomerName=Fname Surname&CustomerEMail=customer@example.com&BillingSurname=Surname&BillingFirstnames=Fname&BillingAddress1=BillAddress Line 1&BillingCity=BillCity&BillingPostCode=W1A 1BL&BillingCountry=GB&BillingPhone=447933000000&DeliveryFirstnames=Fname&DeliverySurname=Surname&DeliveryAddress1=BillAddress Line 1&DeliveryCity=BillCity&DeliveryPostCode=W1A 1BL&DeliveryCountry=GB&DeliveryPhone=447933000000&SuccessURL=https://example.com/success&FailureURL=https://example.com/failure

You could have a look at https://www.sagepaylabs.com/aes_vb.zip - this might point you in the right direction.
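To give you a starting point, below is a minimal VB.NET sketch of the kind of encryption the Form integration expects, assuming the AES-based scheme of Form protocol v3.00: AES-128 in CBC mode with your encryption password used as both key and IV, PKCS#7 padding, and the ciphertext hex-encoded with an "@" prefix. Treat it as a sketch rather than a drop-in implementation, verify the details against your protocol version and the sample in the zip above, and note that the encryption password here is a placeholder.
Imports System.Text
Imports System.Security.Cryptography

Public Module SagePayFormCrypto

    ' Encrypts the "Crypt" field for Sage Pay Form integration.
    ' Assumes AES-128/CBC/PKCS#7 with the encryption password used as both
    ' key and IV, output as uppercase hex prefixed with "@" (protocol v3.00 style).
    Public Function EncryptCrypt(ByVal requestData As String, ByVal encryptionPassword As String) As String
        Dim keyAndIv As Byte() = Encoding.ASCII.GetBytes(encryptionPassword)

        Dim aes As New RijndaelManaged()
        aes.KeySize = 128
        aes.BlockSize = 128
        aes.Mode = CipherMode.CBC
        aes.Padding = PaddingMode.PKCS7
        aes.Key = keyAndIv
        aes.IV = keyAndIv

        Dim plainBytes As Byte() = Encoding.ASCII.GetBytes(requestData)
        Dim cipherBytes As Byte() = aes.CreateEncryptor().TransformFinalBlock(plainBytes, 0, plainBytes.Length)

        ' Hex-encode the ciphertext and prefix it with "@"
        Dim result As New StringBuilder("@")
        For Each b As Byte In cipherBytes
            result.Append(b.ToString("X2"))
        Next
        Return result.ToString()
    End Function

End Module
You would build the crypt string (like the sample above), pass it to EncryptCrypt together with your encryption password, and post the returned value as the Crypt field.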

Related

Using PBKDF2 to store password hashes in Jackrabbit OAK

We are migrating an application that was built on Sling 6 & Jackrabbit to Sling 10 & Oak. We are using Oak 1.6.8, which is the version used in the example Sling 10 application. We had previously built our own authenticators & login plugins to use CryptedSimpleCredentials and keep passwords encrypted in the JCR. It looks like that is now the standard in Oak using CredentialsImpl. I'm trying to decide if we can drop our custom code and just configure Oak properly. I've set the UserConfigurationImpl.config with the following values:
passwordHashAlgorithm="PBKDF2WithHmacSHA256"
passwordHashIterations="1000"
passwordSaltSize="20"
I took the HashAlgorithm key from a comment in org.apache.jackrabbit.oak.spi.security.user.util.PasswordUtil.generatePBKDF2(...). Following the code in PasswordUtil, an algorithm with the PBKDF2 prefix generates the digest using a secret key.
Stepping through the code, I can see that during org.apache.jackrabbit.oak.security.user.UserInitializer.initialize(...) the admin user is created (:139). The hash created for the password uses the above-mentioned settings and produces a hash with salt & iterations:
{PBKDF2WithHmacSHA256}b7dab4b06ad4be41-1000-8675468f4239a321b3dc8b9989a2fae0
However, when trying to log in with the admin user, it fails to authenticate. PasswordUtil.isSame() fails to recognize the algorithm when calling extractAlgorithm(hashedPwd), because MessageDigest.getInstance("PBKDF2WithHmacSHA256") is invalid.
I have not been able to find anyone else looking for help with this topic, which leads me to believe that maybe I have a fundamental misunderstanding that I can't see. Any and all help would be appreciated.
It looks like this was a bug fixed by OAK-7778.

DataPower monitoring or validation techniques

How can we improve DataPower monitoring? For example, I want to check that all objects (FSHs/MQ FSHs, SSL proxies, crypto profiles, etc.) are up and, if one goes down, be notified by email or something similar. I also want to check the number of files in the file management on-disk folders. Basically, I want to validate the adapter after deployment (we use SoapUI to test adapter functionality, but I'm looking for something else to improve or add validation). Please suggest any ideas that can be implemented as a process improvement on DataPower.
For example, you can get the status of all your domains using this SOMA call. You can test it using SoapUI. You can get the list of the various SOMA calls from the DataPower management WSDL (available in the DataPower store directory).
<!-- get all the domains -->
<xsl:variable name="domainsList">
  <dp:url-open target="{$XML-MGMT-URL}" response="responsecode">
    <env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
      <env:Body>
        <dp:request xmlns:dp="http://www.datapower.com/schemas/management">
          <dp:get-status class="DomainStatus"/>
        </dp:request>
      </env:Body>
    </env:Envelope>
  </dp:url-open>
</xsl:variable>
Try using the SOMA commands of the XML management interface to check object status.
I am not sure if this is the best approach, but this is how I have implemented it. You can always create a testing service in DataPower, with or without an interactive Java application, to perform all the SOAP tests you are currently running through SoapUI. You can make SOMA/AMP calls to check the status of objects, ping external services, etc. You can schedule these tests at a regular interval or run them manually.
Depending on how you set it up, you can either generate an email with the status of each object/service you are testing or create an HTML dashboard that records the current status of everything.
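If you would rather drive those SOMA status checks from a scheduled job outside the device (instead of SoapUI or an on-box testing service), here is a rough VB.NET sketch that posts the same get-status envelope to the XML management interface. It assumes the default management endpoint https://<device>:5550/service/mgmt/current and basic authentication; the host, credentials, and certificate handling are placeholders you would adapt, and the response still needs to be parsed before you email it or feed a dashboard.
Imports System.IO
Imports System.Net
Imports System.Text

Public Module DataPowerSomaCheck

    ' Posts a SOMA get-status request to the XML management interface and
    ' returns the raw SOAP response. Host and credentials are placeholders.
    ' Note: you may also need to trust the appliance's SSL certificate.
    Public Function GetDomainStatus(ByVal host As String, ByVal user As String, ByVal password As String) As String
        Dim envelope As String = _
            "<env:Envelope xmlns:env=""http://schemas.xmlsoap.org/soap/envelope/"">" & _
            "<env:Body>" & _
            "<dp:request xmlns:dp=""http://www.datapower.com/schemas/management"">" & _
            "<dp:get-status class=""DomainStatus""/>" & _
            "</dp:request>" & _
            "</env:Body>" & _
            "</env:Envelope>"

        Dim url As String = "https://" & host & ":5550/service/mgmt/current"
        Dim request As HttpWebRequest = CType(WebRequest.Create(url), HttpWebRequest)
        request.Method = "POST"
        request.ContentType = "text/xml"
        request.Credentials = New NetworkCredential(user, password)

        Dim body As Byte() = Encoding.UTF8.GetBytes(envelope)
        request.ContentLength = body.Length
        Dim requestStream As Stream = request.GetRequestStream()
        requestStream.Write(body, 0, body.Length)
        requestStream.Close()

        Dim response As WebResponse = request.GetResponse()
        Dim reader As New StreamReader(response.GetResponseStream())
        Dim result As String = reader.ReadToEnd()
        reader.Close()
        response.Close()
        Return result
    End Function

End Module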

Convert FedEx beta webservice to live

I'm trying to convert the beta web service to live. After removing the word "beta" from the web service URL [i.e., in web.config: endpoint address="https://wsbeta.fedex.com:443/web-services/rate"], the web service is not fetching values. Any suggestions as to why this is happening, or am I missing a step? Any guidance/suggestions on this would be appreciated.
To move a system from testing to production, it is not enough to just remove the word "beta" from the testing URL:
From:
https://wsbeta.fedex.com:443/web-services/rate
To:
https://ws.fedex.com:443/web-services/rate
(When making this change, make sure you replace ALL occurrences of wsbeta. with ws. in your solution.)
You also need to change the MeterNumber and include the Password and Key. When you sign up for the production key, you will get all of this information in the email, and you will also get the Key as soon as you sign up (you won't get that key in the email, so be careful to write down that information).
With those pieces of information you should be good to go. If you are getting an exception, that's a different story; let us know what exception you are getting.
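For completeness, here is a rough VB.NET sketch of where those production values end up when you use a proxy generated from the FedEx Rate WSDL. The RateServiceReference namespace is just a placeholder for whatever your web reference is called, the property names are what the generated proxy typically exposes (verify them against your own proxy), and all credential values are placeholders.
' Assumes a web reference generated from the FedEx Rate WSDL; adjust the
' namespace and member names to match your generated proxy.
' Remember the endpoint in web.config must also point at https://ws.fedex.com:443/web-services/rate.
Dim request As New RateServiceReference.RateRequest()

request.WebAuthenticationDetail = New RateServiceReference.WebAuthenticationDetail()
request.WebAuthenticationDetail.UserCredential = New RateServiceReference.WebAuthenticationCredential()
request.WebAuthenticationDetail.UserCredential.Key = "PRODUCTION_KEY"           ' placeholder
request.WebAuthenticationDetail.UserCredential.Password = "PRODUCTION_PASSWORD" ' placeholder

request.ClientDetail = New RateServiceReference.ClientDetail()
request.ClientDetail.AccountNumber = "PRODUCTION_ACCOUNT" ' placeholder
request.ClientDetail.MeterNumber = "PRODUCTION_METER"      ' placeholder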

Reporting in ASP.NET

I have an SQL Database and an ASP.NET website built to put data into the database.
One of the project requirements is to build a system that lets the user upload a Crystal Report to the server and run it as needed. This way, the user could create a customized report (to then hand off to management or customers) without having to go through a developer.
I'm looking for suggestions on how to accomplish this goal.
Currently, I'm looking for a way to redirect the database connection in the Crystal Report from the database it was developed against to the database it will eventually run on. However, there doesn't seem to be a simple way to do this.
I'm also investigating the ReportViewer object. However, all the code I have seen involves specifying the query for the report in the code, which isn't acceptable.
One option (which I don't like at all) is to let them write their own queries so they can copy the results into Excel. This would mean a blank textbox and information about the structure of the database. Not a good idea for multiple reasons.
Another option is to create one report for each table (and maybe a few extras), let the user copy the data they want into Excel, and go on their merry way.
tl;dr How do I build a flexible reporting system?
=========================================
Continuation: 08/20/2012
I have decided to go the route of b.pell's extension methods. So far, it has gotten me closer than anything else. My code to bind to the CrystalReportViewer is below:
CrystalReportSource rs = new CrystalReportSource();
rs.Report.FileName = Server.MapPath("ReportFiles/") + Request["reportname"];
string connstring = System.Web.Configuration.WebConfigurationManager.ConnectionStrings["myConnectionString"].ConnectionString;
rs.ReportDocument.ApplyCredentialsFromConnectionString(connstring);
rs.ReportDocument.ApplyNewDatabaseName("myDBName", "mySchemaName");
rs.ReportDocument.Refresh();
CrystalReportViewer1.ReportSource = rs;
This comes very close to working. It works fine on my dev machine, but when I run the code on the server, it gives the following error:
Logon failed.Error in File CrystalReport2 {5D2E82E5-783E-4DFD-A770-C8AE72A51E4E}.rpt:
Unable to connect: incorrect log on parameters. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.Runtime.InteropServices.COMException: Logon failed.Error in File CrystalReport2 {5D2E82E5-783E-4DFD-A770-C8AE72A51E4E}.rpt: Unable to connect: incorrect log on parameters. Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
The error is in this line in the code:
crTable.Location = String.Format("{0}{1}", prefix, crTable.Location.Substring(crTable.Location.LastIndexOf(".") + 1))
When I remove the call to ApplyNewDatabaseName, I am asked to enter the Server Name, the Database name, the Username and the Password or to select Integrated Security. I can't enter the Database Name or the Server Name (those fields are disabled).
Any thoughts?
I think what you're looking for is Reporting Services, part of the Business Intelligence stack.
Or maybe you can set up a UI that lets users pick the tables and columns they need for the report (this way you can limit the information they can access) and write a dynamic query builder function or something like that.
I answer the changing-Crystal-Reports-connection question a lot (it's something I'd think Crystal would make easier, but I wonder if they don't because that's what their server product does). :D Anyway, you can set the database credentials at runtime. Crystal is very particular about the order in which it's done, but I have some code that I turned into extension methods that does the trick. This code goes through the main report and all subreports and changes the connection information. It assumes that all subreports connect to the same database as the main report (if not, you'll need to modify it to handle multiple connections, but this rarely comes up, at least with what I do).
Extension methods to change connection info: http://www.blakepell.com/2012-05-22-crystal-reports-extension-methods
It would be used something like this (although you're probably binding to a viewer and not exporting, so you could ignore that part; this is just an example).
Using rd As New ReportDocument
    rd.Load("C:\Temp\CrystalReports\InternalAccountReport.rpt")
    rd.ApplyNewServer("serverName or DSN", "databaseUsername", "databasePassword")
    rd.ApplyParameters("AccountNumber=038PQRX922;", True)
    rd.ExportToDisk(ExportFormatType.PortableDocFormat, "c:\temp\test.pdf")
    rd.Close()
End Using
System.Diagnostics.Process.Start("c:\temp\test.pdf")
You could use the Crystal viewer at this point to deliver the reports, store the report in a database or on the file system (with a DB metadata table), and have some predefined connections the user could select from that would be applied when the report is run.
You also have the option to write your own front end. In this scenario a user would select a report from your metadata (you could put whatever security on it you wanted; I use AD). Then you can read the report parameters in and lay them out on the web form. When the user fills them in, you sanitize them and pass them to the report via these extensions, and you can output Excel, PDF, Word, RTF, etc. A little more overhead and not the nice preview view, but it can work well (I've done something like this in the past). Hope this helps.
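If you want to see roughly what that credential swapping involves without the extension-method wrapper, the core pattern is sketched below: build a ConnectionInfo and apply it to every table in the main report and in each subreport (assuming the subreports hit the same database). This is a sketch against the standard CrystalDecisions object model; adapt the names and add error handling for your setup.
Imports CrystalDecisions.CrystalReports.Engine
Imports CrystalDecisions.Shared

Public Module CrystalConnectionHelper

    ' Applies new server/database/credentials to every table in the report
    ' and its subreports. All connection values are placeholders.
    Public Sub ApplyConnection(ByVal report As ReportDocument, _
                               ByVal server As String, ByVal database As String, _
                               ByVal userId As String, ByVal password As String)
        Dim connInfo As New ConnectionInfo()
        connInfo.ServerName = server
        connInfo.DatabaseName = database
        connInfo.UserID = userId
        connInfo.Password = password

        ' Main report tables
        For Each tbl As Table In report.Database.Tables
            Dim logOn As TableLogOnInfo = tbl.LogOnInfo
            logOn.ConnectionInfo = connInfo
            tbl.ApplyLogOnInfo(logOn)
        Next

        ' Subreports, assuming they hit the same database as the main report
        For Each sec As CrystalDecisions.CrystalReports.Engine.Section In report.ReportDefinition.Sections
            For Each obj As ReportObject In sec.ReportObjects
                If obj.Kind = ReportObjectKind.SubreportObject Then
                    Dim subName As String = CType(obj, SubreportObject).SubreportName
                    Dim subReport As ReportDocument = report.OpenSubreport(subName)
                    For Each tbl As Table In subReport.Database.Tables
                        Dim logOn As TableLogOnInfo = tbl.LogOnInfo
                        logOn.ConnectionInfo = connInfo
                        tbl.ApplyLogOnInfo(logOn)
                    Next
                End If
            Next
        Next
    End Sub

End Module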
About the "...let them write their own queries" part of your question:
The solution can be to use a query builder component with a friendly user interface which hides the complexity of your database from users and avoids any possible SQL injection.
There are a few such products on the market. One of them is called EasyQuery; another is built by Aspose, if I'm not wrong. Try searching Google for "query builder for asp.net" or ".net query builder component".

.netCART Credit Card Decryption - IIS 7 App Pool and Decryption issue

I've got a site using .netCART. It's running fine in production with Windows Server 2003 and .NET 2.0. On the new server (Windows Server 2008) everything is working except for credit card decryption in the store admin. No errors are being sent, no exceptions thrown, just the encrypted string being output to the screen instead of a decrypted credit card number.
Dim strCCEncrypt As String
strCCEncrypt = Trim(DataRow.Item("CreditCard"))
strCCEncrypt = tools.Decrypt(strCCEncrypt) 'tools is a .netCART utility
Has anyone had experience with .netCART, or seen this issue before?
EDIT:
After much investigation yesterday, it seems as though the problem is tied to the app pool (which is running in classic pipeline mode on .NET 2.0) and decryption. Can anyone tell me what the processes or services are that are tied to the default app pool which help handle decryption?
Don't know where your specific problem is, but that code snippet is equivalent to this:
Dim CCEncrypt As String = tools.Decrypt(DataRow("CreditCard").ToString().Trim())
To explain the changes:
You can skip the .Item part because it's an indexer for DataRow
But you should call .ToString(), in case of other types or DbNulls
Then use the string type's .Trim() method rather than the VB Trim() function. Trim() and the other old string functions exist solely for backwards compatibility. You're better off becoming accustomed to the methods attached to the string type.
In .Net, it's no big deal to declare a variable and assign to it on the same line
And in .Net, Microsoft's style guidelines specifically recommend against any hungarian-notation type warts on variable names.
The end result of this problem was that I used Reflector to extract the method and provide the key manually to perform the decryption, since the Decrypt method shown above was just a wrapper around a method that took the key.
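For anyone hitting the same wall: once Reflector has shown you the algorithm and key, decrypting manually is just a matter of wiring those parameters back up yourself. The .netCART internals aren't public, so the snippet below is only an illustration of that general shape using TripleDES; substitute whatever algorithm, key, IV, and encoding Reflector actually shows you.
Imports System.Text
Imports System.Security.Cryptography

Public Module ManualCardDecrypt

    ' Illustrative only: decrypts a Base64 ciphertext with an explicitly supplied
    ' TripleDES key/IV. The real .netCART routine may use a different algorithm,
    ' key derivation, or encoding - copy those details from Reflector.
    Public Function DecryptWithKey(ByVal cipherTextBase64 As String, ByVal key As Byte(), ByVal iv As Byte()) As String
        Dim provider As New TripleDESCryptoServiceProvider()
        provider.Key = key
        provider.IV = iv
        provider.Mode = CipherMode.CBC
        provider.Padding = PaddingMode.PKCS7

        Dim cipherBytes As Byte() = Convert.FromBase64String(cipherTextBase64)
        Dim plainBytes As Byte() = provider.CreateDecryptor().TransformFinalBlock(cipherBytes, 0, cipherBytes.Length)
        Return Encoding.UTF8.GetString(plainBytes)
    End Function

End Module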
Check the machineKey element in your web.config. Is it possible the credit cards were encrypted with a different key than the one you are trying to decrypt them with?
