ASP.NET MVC integration test

Hi, I'm doing TDD for an ASP.NET MVC project and I need end-to-end tests that send a request to a controller action and go all the way through to the repository. I have tried using the code here, but unfortunately I can't get it to run and I'm running out of time. Does anyone know another way to fake an HTTP request and populate the request's POST parameters in a test scenario?
My controller action is as follows:
[HttpPost]
public ActionResult CreateUser(User user)
{
}
So I basically need to make an HTTP request that populates this User object and, ideally, saves it to a test repository.

As you posted the link, I'll take an extract from Steve Sanderson's blog:
Integration tests test your entire software stack working together. These tests don’t mock or fake anything (they use the real database, and real network connections) and are good at spotting if your unit-tested components aren’t working together as you expected. In general, it’s best to put most of your effort into building a solid suite of unit tests, and then adding a few integration tests for each major feature so you can detect any catastrophic incompatibilities or configuration errors before your customers do.
You shouldn't be faking HTTP requests at this stage, as an integration test inherently exercises every component together.
Try some type of browser automation framework:
http://blog.stevensanderson.com/2010/03/30/using-htmlunit-on-net-for-headless-browser-automation/
http://www.codeproject.com/KB/cs/mshtml_automation.aspx

If you want to do full integration testing, then test your application from the user's perspective. Create test cases like:
Log in as admin
Go to Users page
Add User with name "User1"
Check that a user with the name "User1" is listed in the Users grid.
And automate such tests using Selenium or WatiN. See example here; a rough sketch of the steps above follows.
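As an illustration, those steps might look roughly like this with Selenium WebDriver in C# (a sketch only: the URLs, element IDs, and link text are hypothetical and need to match your actual views and login form):
using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Firefox;

[Test]
public void AddUser_ShowsUpInUsersGrid()
{
    using (IWebDriver driver = new FirefoxDriver())
    {
        // Log in as admin (hypothetical URL and field IDs)
        driver.Navigate().GoToUrl("http://localhost:1234/Account/LogOn");
        driver.FindElement(By.Id("UserName")).SendKeys("admin");
        driver.FindElement(By.Id("Password")).SendKeys("secret");
        driver.FindElement(By.CssSelector("input[type='submit']")).Click();

        // Go to the Users page and add a user named "User1"
        driver.Navigate().GoToUrl("http://localhost:1234/Users");
        driver.FindElement(By.LinkText("Add User")).Click();
        driver.FindElement(By.Id("Name")).SendKeys("User1");
        driver.FindElement(By.CssSelector("input[type='submit']")).Click();

        // Check that "User1" is listed in the Users grid
        Assert.IsTrue(driver.PageSource.Contains("User1"));
    }
}
WatiN reads much the same way; either tool drives a real browser against the running site, so the whole stack (routing, model binding, repository, database) is exercised.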

You may also want to take a look at the Verde framework. Semantically the tests look similar to Steve Sanderson's MvcIntegrationTestFramework; the key difference is that Verde executes tests in the context of your actual IIS AppDomain (via a browser-based test runner) rather than a programmatically created one. This provides a couple of advantages: first, it is a more realistic emulation of your actual application's configuration, network topology, security settings, etc.; second, you can automate running the tests as a post-deployment step, or even run them automatically as part of application monitoring in production. Here is an example Verde test taken from the MvcMusicStore sample included in the source code on GitHub:
[IntegrationTest]
public void Index_Load_ExpectedHtml()
{
    // Get a product to load the details page for.
    var album = storeDB.Albums
        .Take(1)
        .First();

    using (var scope = new MvcExecutorScope("Store/Details/" + album.AlbumId))
    {
        Assert.AreEqual(200, scope.HttpContext.Response.StatusCode);
        Assert.IsTrue(scope.Controller is StoreController);
        Assert.AreEqual("Details", scope.Action);

        var model = scope.Controller.ViewData.Model as Album;
        Assert.IsNotNull(model);
        Assert.AreEqual(album.AlbumId, model.AlbumId);

        Assert.IsFalse(String.IsNullOrEmpty(scope.ResponseText));

        // Load the ResponseText into an HtmlAgilityPack HtmlDocument
        var html = new HtmlDocument();
        html.LoadHtml(scope.ResponseText);

        // Use the ScrapySharp CSS selector extension to make assertions about the rendered HTML
        Assert.AreEqual(album.Title, html.DocumentNode.CssSelect("#main h2").First().InnerText);
    }
}
There is a NuGet package which makes it very easy to add to your MVC project.
http://dvonlehman.github.com/Verde/
https://nuget.org/packages/Verde/0.5.1

Related

Mock real gRPC server responses

We have a microservice that needs to be integration tested (real calls, but no network communication with anything outside of the test namespace in Kubernetes) in our pipeline. It also relies on an external gRPC server which we have no control over.
The setup we'd like (shown in the original diagram) is this: a driver component provides the Microservice Boundary with 'external' data, then keeps calling the Code via REST until it gets back the proper number of records or it times out. The Code pulls records from an internal database, as well as data associated with those records from a gRPC call. Since we do not own the gRPC service, but are doing integration tests, we need a few pre-defined responses to the two gRPC services we call.
Since our integration tests are self-contained right now, and we don't want to write an entirely new gRPC server implementation just to mimic calls, is there a way to stand up a real gRPC server and configure it to return canned responses? It's pretty much a mock setup, except with an actual server.
We need to be able to:
give the server multiple proto files to interpret and have it expose those as endpoints; the proto files must be able to have different package names
configure the responses to each call using files we can store in source control
run in a Linux Docker container (no Windows)
I did find gripmock, which seemed almost exactly what we need, but it only serves one proto file per container. It can supposedly serve more than one, but I can't get that to work, and their example that serves two files implies each proto file must have the same package name, which will likely never happen in our scenarios. In the meantime we are using it, but if we have 10 gRPC call dependencies, we now have to run 10 gripmock servers.
Wikipedia contains a list of API mocking tools. Looking at that list today, there is a commercial tool that supports gRPC called Traffic Parrot, which allows you to create gRPC mocks based on your proto files. You can give it multiple proto files, store the mocks in Git, and run the tool in Docker.
There are also open-source tools like GripMock, but it does not generate stubs based on proto files; you have to create them manually. Also, as of today the project has not been keeping up with proto and gRPC developments, e.g. the package-name issue you discovered yourself above (it works only if the package names in different proto files are the same). There are a few other open-source tools like grpc-wiremock, grpc-mock or bloomrpc-mock, but they still lack widespread adoption and hence might be risky to adopt for an important enterprise project.
Keep in mind that the generated mock will only be a test double; it will not replicate the full behaviour of the system the proto file corresponds to. If you also want to partially replicate the semantics of the messages, consider recording real gRPC messages to create the mocks, so that you can see sample data as well.
Take a look at this JS library which hopefully does what you need:
https://github.com/alenon/grpc-mock-server
Usage example:
private static readonly PROTO_PATH: string = __dirname + "/example.proto";
private static readonly PKG_NAME: string = "com.alenon.example";
private static readonly SERVICE_NAME: string = "ExampleService";
...
const implementations = {
    ex1: (call: any, callback: any) => {
        const response: any =
            new this.proto.ExampleResponse.constructor({msg: "the response message"});
        callback(null, response);
    },
};

this.server.addService(PROTO_PATH, PKG_NAME, SERVICE_NAME, implementations);
this.server.start();

How to properly configure Spring Datasource for an Elastic Beanstalk app?

I'm running into an issue integrating Spring Security with my Elastic Beanstalk app backed by a MySQL database. If I deploy my app, I'm able to log in correctly for some time, but eventually I start to receive login errors without an exception being thrown, so I'm unable to get any useful information about the issue. I've downloaded the logs as well and can't see anything of value: I can see the logs recording access to the public page, the attempt to access the private section, the login page being returned, and then the loginError page, but nothing about any underlying problem.
Even though I'm unable to log in through a browser, I am able to log in if I run the app from an IDE, and I can view the database in MySQL Workbench. This suggests to me the problem is due to some persistent state on the server.
I've had a similar problem before with another Beanstalk app using Spring Security and was able to resolve it by setting application properties as follows:
spring.datasource.test-on-borrow=true
spring.datasource.validation-query=SELECT 1
I'm using a more recent version of Spring than that app, and the properties have since been changed to be datasource-specific, so I tried adding the following properties:
spring.datasource.tomcat.test-on-borrow=true
spring.datasource.tomcat.validation-query=SELECT 1
When that didn't work I added another based on an answer to a similar question here; now the properties are:
spring.datasource.tomcat.test-on-borrow=true
spring.datasource.tomcat.test-while-idle=true
spring.datasource.tomcat.validation-query=SELECT 1
That seemed to work (possibly due to lower login activity) but eventually resulted in the same behavior.
I've looked into the various properties available but before I spend a lot of time randomly setting and/or overriding default settings I wanted to see if there's a reliable way to deal with this.
How can I configure my datasource to avoid login errors after long periods of time?
This isn't a problem with specific configuration values but with where those configurations reside. The default location for application.properties (/resources; IntelliJ) is fine when deploying as a jar with an embedded Tomcat server, but not as a war with a provided server: the file isn't found/used, so no changes to it affect the configuration actually used on AWS.
There are a number of ways to handle this; I chose to add an RDS configuration bean in my SpringBootServletInitializer:
@Bean
public RdsInstanceConfigurer instanceConfigurer() {
    return () -> {
        TomcatJdbcDataSourceFactory dataSourceFactory =
                new TomcatJdbcDataSourceFactory();
        // Abandoned connections
        dataSourceFactory.setRemoveAbandonedTimeout(60);
        dataSourceFactory.setRemoveAbandoned(true);
        dataSourceFactory.setLogAbandoned(true);
        // Tests
        dataSourceFactory.setTestOnBorrow(true);
        dataSourceFactory.setTestOnReturn(false);
        dataSourceFactory.setTestWhileIdle(false);
        // Validation
        dataSourceFactory.setValidationInterval(30000);
        dataSourceFactory.setTimeBetweenEvictionRunsMillis(30000);
        dataSourceFactory.setValidationQuery("SELECT 1");
        return dataSourceFactory;
    };
}
Below are the settings that worked for me, taken from "Connection to Db dies after >4<24 in spring-boot jpa hibernate":
dataSourceFactory.setMaxActive(10);
dataSourceFactory.setInitialSize(10);
dataSourceFactory.setMaxIdle(10);
dataSourceFactory.setMinIdle(1);
dataSourceFactory.setTestWhileIdle(true);
dataSourceFactory.setTestOnBorrow(true);
dataSourceFactory.setValidationQuery("SELECT 1 FROM DUAL");
dataSourceFactory.setValidationInterval(10000);
dataSourceFactory.setTimeBetweenEvictionRunsMillis(20000);
dataSourceFactory.setMinEvictableIdleTimeMillis(60000);

Which is the best way to test database packages?

I am currently working on a project where we need to test the database packages and functions.
We need to provide the input parameters to the database package and test that the package returns the expected value; we also want to test the response time of the request.
Please advise whether there is a tool available to perform this, or whether we should write our test cases in JUnit or some other framework.
Which would be the best approach?
I've used a more native approach when I had to do DWH testing: I arranged the test framework around the dev data-integration framework that was already in place, so I had a lot of reusable jobs, configurations and code. But using OOP as you suggest,
write our test cases in JUnit
is a way to go too. Keep in mind, though, that DWH designs are often very complex (with a lot of aspects to consider), and interacting through the persistence layer is not always the best candidate for a testing strategy. A more DB-oriented solution (like tSQLt) offers significantly better performance.
Those resources helped me a lot:
dwh_testing
data-warehouse-testing
what-is-a-data-warehouse-and-how-do-i-test-it
My framework Acolyte provides a JDBC driver and tools designed for such purposes (mock-up, testing, ...): http://tour.acolyte.eu.org
It's already used in some open source projects (Anorm, YouTube Vitess, ...), either in vanilla Java or using its Scala DSL.
// Configure the statement handler (details elided as in the original)
handler = handleStatement.withQueryDetection(...).
    withQueryHandler(/* which result for which query */).
    withUpdateHandler(/* which result for which update */);

// Register prepared handler with expected ID 'my-unique-id'
acolyte.Driver.register("my-unique-id", handler);

// then ...
Connection con = DriverManager.getConnection(jdbcUrl);

// ... Connection |con| is managed through |handler|

Programming a Web Portal for Microsoft Dynamics CRM

I'm working on a web portal for customers that will connect to Microsoft Dynamics CRM. I don't want to make Dynamics CRM directly an internet-facing deployment (IFD), so I'd like to use a separate database that the web interface interacts with, and then use web services to move the data between the web portal database and Dynamics CRM.
I'm just looking for thoughts on whether this is the best way to proceed, and whether there are any good code examples, etc., that I can look at for implementing this.
I saw Microsoft has a Customer Portal, but it looks (at a cursory glance) like it requires an IFD deployment, which I don't want.
First, after creating your ASP.NET project (WebForms or MVC 3), add the following references:
Microsoft.Crm.Sdk.Proxy
Microsoft.Xrm.Sdk
System.Runtime.Serialization
System.ServiceModel
In your code-behind, create a class and add the following code:
private IOrganizationService GetCrmService(string userName, string password, string domain, Uri serviceUri)
{
    ClientCredentials credentials = new ClientCredentials();
    credentials.Windows.ClientCredential = new System.Net.NetworkCredential(userName, password, domain);
    //credentials.UserName.UserName = userName; // uncomment in case you want to impersonate
    //credentials.UserName.Password = password;
    ClientCredentials deviceCredentials = new ClientCredentials();

    // Note: don't wrap the proxy in a using block here, or it will be disposed
    // before the caller can use it; the caller should dispose it when done.
    var serviceProxy = new OrganizationServiceProxy(serviceUri, null, credentials, deviceCredentials);
    serviceProxy.ServiceConfiguration.CurrentServiceEndpoint.Behaviors.Add(new ProxyTypesBehavior());
    return (IOrganizationService)serviceProxy;
}
If you want to retrieve multiple records:
string fetch = @"My Fetch goes here";
EntityCollection records = GetCrmService(userName, password, domain, serviceUri).RetrieveMultiple(new FetchExpression(fetch));
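Similarly, pushing portal data into CRM is done with IOrganizationService.Create. A minimal sketch (the "contact" entity and attribute names here are only illustrative examples):
// requires Microsoft.Xrm.Sdk; 'service' is the IOrganizationService returned by GetCrmService(...)
var contact = new Entity("contact");
contact["firstname"] = "User1";
contact["lastname"] = "Example";
Guid newId = service.Create(contact);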
I highly recommend downloading the SDK or checking this.
You'll find many samples and walkthroughs that will help you build good portals.
I think it's a good strategy because:
It allows you to asynchronously put the data entered on the website into the CRM. This decoupling ensures neither the CRM nor the website becomes the other's bottleneck.
Only the intermediate service layer is internet facing, so you stay in control of what CRM information could be disclosed or altered if this service layer is compromised.
The architecture you're after is reminiscent of the way the CRM Asynchronous Service works (asynchronous plugins and workflows work this way):
A job is put in a queue (table) in the CRM DB.
A scheduled service awakes every x seconds and fetches the latest y records from the queue table.
The service performs each job and writes the result (success, error message log) back to the queue table's records.
So the thing that is probably hardest is writing a good scheduled service that never lets an exception escape (but always digests and logs it) and properly writes the results back to the DB; a rough sketch of that polling pattern is shown below.
To learn more about the Dynamics CRM's "Asynchronous Service Architecture", refer to the following: http://msdn.microsoft.com/en-us/library/gg334554.aspx
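To make the pattern concrete, here is a minimal C# sketch of such a poller. The queue repository, job interface, and timing values are hypothetical placeholders, not CRM SDK types:
using System;
using System.Collections.Generic;
using System.Threading;

public interface IJob
{
    Guid Id { get; }
    void Execute();                        // e.g. push one portal record into CRM
}

public interface IJobQueueRepository
{
    IEnumerable<IJob> FetchPending(int batchSize);
    void MarkCompleted(Guid id);
    void MarkFailed(Guid id, string error);
}

public class QueueWorker
{
    private readonly IJobQueueRepository _queue;

    public QueueWorker(IJobQueueRepository queue)
    {
        _queue = queue;
    }

    public void Run(CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            foreach (var job in _queue.FetchPending(batchSize: 10))    // "fetch the latest y records"
            {
                try
                {
                    job.Execute();
                    _queue.MarkCompleted(job.Id);
                }
                catch (Exception ex)
                {
                    // Never let an exception kill the service; log it back to the queue row instead.
                    _queue.MarkFailed(job.Id, ex.ToString());
                }
            }
            Thread.Sleep(TimeSpan.FromSeconds(30));                     // "awakes every x seconds"
        }
    }
}
The important part is the catch block: a bad record is logged back to its queue row and the loop carries on, so the service itself never dies.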
It looks like a good approach.
It will improve the performance of both the portal and CRM.
The data shown on the portal is nearly real-time, i.e. it is not real-time.
Throughout development, keep checking that there is not so much async processing that it keeps the CRM server busy all the time.
I don't think the accelerators/portals require CRM to be an IFD instance; I guess only the portal part needs to be internet facing (of course, to make it usable for the purpose!).
Anwar is right, the SDK is a good launchpad for such research.
The Customer Portal does not require an IFD deployment. And if you do not like the Customer Portal, you can always use the SDK extensions for portal development (microsoft.xrm.client.dll, microsoft.xrm.portal.dll and the PortalBase solution), which are all included in the SDK.
There is a great resource on how to build a portal using the SDK portal extensions:
Dynamics CRM 2011 Portal Development

Return values from exe in javascript

I have to call an executable on the client machine from ASP.NET and get the return parameters; I've been looking for an example but couldn't find one.
Is it possible to recover the output parameters from an exe in JavaScript?
I know that I can write:
var WshShell = new ActiveXObject("WScript.Shell");
var oExec = WshShell.Exec("My.exe");
but the client's executable returns 0 or 1, and those values are the ones I need to collect.
Thanks in advance
Browser-based JavaScript can't call executable files on client machines; to do so would be a catastrophic security problem. If you have to run an executable on the client machine, consider asking the user to install a .NET application, an ActiveX control, or something like Java if you want to be platform-independent.
Depending on what you're trying to do, you may not need to run an EXE on the client machine; you can do a lot with standard cloud-type scenarios (JavaScript or Silverlight on the client, web services or WCF on the server). Without more information about your situation, however, it's impossible to tell.
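For instance, if you go the .NET-application route (or can run the executable on the server instead), capturing a 0/1 return value is straightforward with System.Diagnostics.Process; a minimal sketch, assuming a hypothetical My.exe:
using System.Diagnostics;

var startInfo = new ProcessStartInfo("My.exe")
{
    UseShellExecute = false,
    RedirectStandardOutput = true        // only needed if you also want its console output
};

using (var process = Process.Start(startInfo))
{
    string output = process.StandardOutput.ReadToEnd();   // anything the exe prints
    process.WaitForExit();
    int result = process.ExitCode;                        // the 0 or 1 you're after
}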
EDIT: Based on your comments that you're using WshShell.Exec, you can use the StdOut property of the WshScriptExec object that method returns. From MSDN's article on the StdOut property:
var input = "";
while (!oExec.StdOut.AtEndOfStream)
{
    // Read the executable's standard output one character at a time
    input += oExec.StdOut.Read(1);
}
// If all you need is the 0/1 result, the exit code is also available
// once the process has finished: oExec.ExitCode
