Testing Analysis Services - automated tests

We are looking to build a cube in Microsoft SQL Server Analysis Services, but would like to be able to use some of the automated testing infrastructure we already have, such as CruiseControl for automated builds, deployments and tests.
I am looking for pointers on building tests against Analysis Services, and for any experience with adding these to a build pipeline.
If automation is not possible, some manual test methods would also be welcome.

Recently I came across the BI.Quality project on CodePlex, and from what I can tell it is very easy to learn and to integrate into an existing deployment process.

There is another framework named NBi. Compared to BI.Quality it offers additional features, such as checking the existence of a measure, dimension or attribute, the order of members, and the count of members. When comparing two result sets it is also often easier to spot the difference between them with NBi. The test suites are edited in a single XML file validated by an XSD, which makes for a better user experience.
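If you want to wire the same kind of checks into a build pipeline yourself, here is a minimal sketch in Python/pytest of what an automated cube test could look like. It assumes a hypothetical run_mdx() helper that executes an MDX query against the cube (for example via ADOMD.NET through pythonnet, or an XMLA endpoint); the connection string, cube name, queries and expected figures are placeholders, not taken from the question.

```python
# Sketch of an automated cube test, assuming a hypothetical
# run_mdx(connection_string, mdx) helper that returns rows as tuples.
import pytest

CONN = "Data Source=localhost;Catalog=SalesCube"  # placeholder connection string


def run_mdx(connection_string, mdx):
    """Hypothetical helper: execute MDX against the cube (e.g. via ADOMD.NET
    through pythonnet, or an XMLA endpoint) and return a list of row tuples."""
    raise NotImplementedError("wire this up to your Analysis Services instance")


def test_total_sales_matches_expected():
    # The expected value would normally come from a query against the source data warehouse.
    expected_total = 1_234_567.89  # placeholder figure
    rows = run_mdx(CONN, "SELECT [Measures].[Sales Amount] ON 0 FROM [Sales]")
    assert rows[0][0] == pytest.approx(expected_total, rel=1e-6)


def test_product_dimension_is_not_empty():
    rows = run_mdx(CONN, "SELECT {} ON 0, [Product].[Product].MEMBERS ON 1 FROM [Sales]")
    assert len(rows) > 0  # the dimension should have members after processing
```

Tests like these can then run as an ordinary test step in CruiseControl after the cube has been deployed and processed.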

Related

Test automation for microservices architecture

I am in charge of implementing QA processes and test automation for a project using a microservices architecture.
The project has one public API that makes some data available, so I will automate API tests. Those tests will live in one repository. This part is clear to me; I have done it before on other, monolithic projects, where I had one repo for API tests and possibly another repo for Selenium tests.
But here the whole product consists of many microservices that communicate via RESTful APIs and/or RabbitMQ queues. How would I go about automating tests for each of these individual services? Would the tests for each individual service live in a separate repo? Note: the services are written in Java or PHP; I will automate the tests with Python. It seems to me that I will end up with a lot of repos for tests/stubs/mocks.
What suggestions or good resources can the community offer? :)
Keep unit and contract tests with the microservice implementation.
Component tests make sense in the context of composite microservices, so keep them together.
Have the integration and E2E tests in a separate repo, grouped by use cases.
For this kind of testing I like to use Pact. (I know you said Python, but I couldn't find anything similar in that space, so I hope you (or other people searching) will find this excellent Ruby gem useful.)
For testing from outside in, you can just use the proxy component - hope this at least gives you some ideas.
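Since you mentioned Python: as a rough illustration of the consumer side of a contract test (loosely in the spirit of Pact, but in plain Python with the responses library), you can pin down the exact response shape your consumer relies on against a stubbed provider. The service name, URL and fields below are made up for the example.

```python
# Sketch of a consumer-side contract check in plain Python, loosely in the
# spirit of Pact. The service name, URL and fields are illustrative only.
import requests
import responses

ORDER_SERVICE = "http://order-service.local/api/orders/42"  # hypothetical provider


@responses.activate
def test_order_consumer_contract():
    # The expectation we record: only the fields our consumer actually depends on.
    responses.add(
        responses.GET,
        ORDER_SERVICE,
        json={"id": 42, "status": "SHIPPED", "total": 99.5},
        status=200,
    )

    resp = requests.get(ORDER_SERVICE, timeout=5)
    body = resp.json()

    # Assert only on the parts of the contract the consumer uses,
    # so unrelated provider changes don't break this test.
    assert resp.status_code == 200
    assert set(body) >= {"id", "status", "total"}
    assert isinstance(body["total"], (int, float))
```

Tests like this can live with the consumer, while a mirrored provider-side check lives with the service that serves the data.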
Give each microservice its own code repository, and add one for the cross-service end-to-end tests.
Inside a microservice's repository, keep everything that relates to that service, from code and tests to documentation and the pipeline definition:
root/
  app/
    source-code/
    unit-tests/ (also: integration-tests, component-tests)
  acceptance-tests/
  contract-tests/
Keep everything that your build step uses in one folder (here: app), probably with sub-folders to distinguish source code from unit tests, integration tests, and component tests.
Put tests like acceptance tests and contract tests, which run in later stages of the delivery pipeline, in their own folders. This keeps them visually separate. It also simplifies creating separate build/test steps for them, for example by including their own pom.xml files when using Maven.
If a developer changes a feature, he will need to change the tests at the exact same time to ensure the two fit together. Keeping code and tests in the same repository keeps the two in sync in a natural way.
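If you follow this layout with Python rather than Maven, the same separation can be expressed as pipeline steps that each point pytest at one folder. Below is a hypothetical stage runner; the stage names and paths mirror the layout above and are only a sketch.

```python
# Hypothetical CI stage runner: each pipeline step invokes one test folder so
# that unit/component tests and acceptance/contract tests run as separate stages.
import sys
import pytest

STAGES = {
    "unit": "app/unit-tests",
    "component": "app/component-tests",
    "acceptance": "acceptance-tests",
    "contract": "contract-tests",
}


def main() -> int:
    stage = sys.argv[1] if len(sys.argv) > 1 else "unit"
    path = STAGES.get(stage)
    if path is None:
        print(f"unknown stage: {stage}; expected one of {sorted(STAGES)}")
        return 2
    # pytest.main returns a non-zero exit code on failure, which fails the CI step.
    return pytest.main([path])


if __name__ == "__main__":
    sys.exit(main())
```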

How can I share a class library between multiple applications?

We have numerous small applications in our organization, most of them .NET web applications. Each of these has a separate authorization system.
I would like to create one library of functions for authorization checks and use it in all these applications.
I would like this library to be centralized, so that if I want to make changes to it, I don't have to go into each application separately and update the reference.
What would be a good way of doing this, or should I take a different approach?
PS: Since I will be looking to use this concept also for other situations, I'm not looking for authorization specific solutions.
Don't do this.
Everything will be fine for ages, and then an error will creep in and all your sites will go down at once. Instead, look at automating your build, test and deploy cycle with tools like TeamCity, NUnit and OctopusDeploy, so that you can regression test each and every app, then deploy the newly tested package (i.e. web site + security dll) quickly and easily.
The way I would do this is to pull your library out into a separate DLL and start using it in one application (I'll assume this is of course fully unit tested :-) ). Next, use NuGet to build yourself a package containing the DLL. You can host the NuGet package (just put it on a share to start with) and then re-use it in all your other applications.
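If your stack were Python rather than .NET, the analogous approach would be to package the shared code as its own distributable and let every application pin a version of it. A minimal sketch, with made-up names:

```python
# Minimal packaging sketch, analogous to the NuGet approach above.
# The library name and version are invented for illustration.
from setuptools import setup, find_packages

setup(
    name="acme-authorization",              # hypothetical shared library
    version="1.2.0",                         # bump on every change; apps upgrade deliberately
    packages=find_packages(exclude=["tests*"]),
    python_requires=">=3.8",
)
```

Each application then pins acme-authorization==1.2.0 in its requirements and picks up new versions through its own build, which preserves the "regression test each app, then deploy" workflow from the first answer.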

Sharing stored procedures across multiple apps

Team A has an enterprise app that uses ADO.NET for data access and executes stored procedures. The data access is encapsulated in its own project (let's call it DAL.dll).
Team B is creating another, unrelated app that reuses the stored procedures from the enterprise app. This app currently uses the MS application block for data access. The issue we run into is that whenever Team A makes any change to the input/output parameters of the stored procedures, there is a runtime error in Team B's app, and that app needs to be updated to accommodate the additional parameters (or the parameters that were removed). Most of these changes go unnoticed until a user complains. At the very least, we would like the app to throw a compilation error so that the build process warns us of the changes made.
One way to do this is to have Team B's project add a reference to the DAL.dll
I'd like to know if there are any other cleaner ways of solving the issue. We are ready to replace Team B's MS Data application block to use a different technology (Entity Framework?) if necessary.
Among the other answers, I'd strongly suggest getting those stored procedures into source control, in a Database Project. You then may be able to use the features of your source control system to do several things:
Lock some of the code so that it cannot be changed
Give you notifications if the code is changed
Warn you if the stored procedures change in a way that would prevent them from being called
Branch the stored procedures so that each team can have their own version of changed code, while keeping the unchanged stored procedures common. You of course will need to separate the different versions in the database.
I agree with the other posters on this thread that you should not share stored procedures across different .NET DLLs; that is just a recipe for disaster. I would also shy away from ORMs like Entity Framework if you are doing anything at all complicated with your database schema, because ORMs excel at translating a simple object model from your .NET application classes into SQL tables and stored procedures, but traditionally do poorly at optimizing them for performance on the database side. There will be people who claim otherwise, and they may have a valid point if you are an expert in wrangling an ORM to do what you want like they are, but chances are you are not, and it will cause you headaches in the long run.
A shared data access layer might work, but conceptually you are then just changing the implementation of the dependency from some code that a DBA wrote to some code that a .NET programmer wrote. Yes, you can use integration tests to achieve better verifiability, but the same case could be made for SQL with tools like Red Gate's SQL Test. I would shy away from this approach if the two applications are already experiencing some sort of pain from sharing stored procedures. That is an indication that the dependency should just be done away with.
If it were up to me, I'd just make a new schema for Team B's app. You can read more about schemas in SQL Server here: MSDN Schema description for 2008 R2. You can think of them as namespaces for SQL Server but with some additional bells and whistles like permission and access control. Separating out your different applications into separate schemas on the same shared database will probably make for the most flexible implementation in the long run.
unrelated app that's reusing the stored procedures in the enterprise app
If these two applications are really unrelated, why are they sharing stored procedures, or even the same database? I know this is a long read, but I recommend you read this: A Better Path to Enterprise Architectures
The partitioning concept in there relates to the bounded context in Domain-Driven Design:
Multiple models are in play on any large project. Yet when code based on distinct models is combined, software becomes buggy, unreliable, and difficult to understand. Communication among team members becomes confusing. It is often unclear in what context a model should not be applied.
Therefore: Explicitly define the context within which a model applies. Explicitly set boundaries in terms of team organization, usage within specific parts of the application, and physical manifestations such as code bases and database schemas. Keep the model strictly consistent within these bounds, but don’t be distracted or confused by issues outside.
It is to be expected that you end up with problems when you don't explicitly deal with this. You're lucky you're seeing early failures, as this can turn into problems that are much harder to find in the long run.
Analyze the problem again with the above in mind. Consider if you're missing some explicit context where this common functionality should live.
My question is: which team owns the stored procedures and the shared database? Usually, as a matter of good architecture/design, you should not have two different apps sharing the same database/procedures.
A better way to share data/functionality between two different applications is through a service or API, so that the team who owns the functionality is responsible for maintaining it.
Also, good communication between both teams is highly recommended.
Depending on the owner of the DAL project, you could host web services and share the API. That way, you separate the Data Access Layer from the business logic, which allows anyone to use the same DAL without having to publish it to each different location.
From my point of view, it looks like both Team A and Team B should share the same core model and look at Multitier architecture as a possible solution.
It sounds like it would make sense to create a shared DAL that both applications can share.
I would add unit tests (or really integration tests) to make sure the DAL is compatible with the apps after changes. That way your tests will fail if incompatible changes have been made.
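As a rough sketch of what such an integration test could look like in Python (the stored procedure name, parameters, columns and connection string are invented), using pyodbc against the shared database:

```python
# Sketch of an integration test that fails the build when a shared stored
# procedure's parameters or result shape change. All names are invented.
import os
import pyodbc
import pytest

CONN_STR = os.environ.get(
    "DAL_TEST_DB",
    "Driver={ODBC Driver 17 for SQL Server};Server=localhost;Database=Enterprise;Trusted_Connection=yes;",
)


@pytest.fixture
def cursor():
    conn = pyodbc.connect(CONN_STR)
    try:
        yield conn.cursor()
    finally:
        conn.close()


def test_get_customer_orders_contract(cursor):
    # Call the procedure exactly as Team B's app would.
    cursor.execute("{CALL dbo.GetCustomerOrders (?)}", 42)
    columns = [col[0] for col in cursor.description]
    # If Team A adds, removes or renames parameters/columns, this test fails
    # in the pipeline instead of surfacing as a runtime error in production.
    assert columns == ["OrderId", "OrderDate", "Total"]
```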
"I'd like to know if there are any other cleaner ways of solving the issue."
The cleanest way is for Team B to sit down with Team A and encapsulate the relevant business logic into a shared API. It doesn't matter so much how you implement that API; what does matter is that the API's interface is documented and versioned so everyone knows what to expect.
One reasonable mechanism for this in a .NET environment is to use Microsoft's WebAPI.
In short, the question of "how do we share a stored procedure?" is most likely looking at the wrong level of abstraction.
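To make the "documented and versioned API" point concrete, here is a small sketch of what such a boundary could look like, shown in Python/Flask purely for illustration (in the .NET world a WebAPI controller would play this role). The route and payload shape are invented.

```python
# Illustrative sketch of a versioned API boundary in front of the shared
# business logic. The route and payload shape are invented for the example.
from flask import Flask, jsonify

app = Flask(__name__)


@app.route("/api/v1/customers/<int:customer_id>/orders")
def customer_orders(customer_id: int):
    # Internally this would call the shared business logic / stored procedures.
    orders = [{"orderId": 1, "total": 99.5}]  # placeholder data
    # Consumers code against this documented v1 shape; breaking changes go to /api/v2.
    return jsonify({"customerId": customer_id, "orders": orders})


if __name__ == "__main__":
    app.run(port=5000)
```

Both teams then depend on the versioned interface rather than on each other's database internals.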

Implementation of an ASP.NET-based portal-like application

There is a requirement to write a portal-like ASP.NET-based web application.
There should be a lightweight central application which implements the primary navigation and the authentication. The design is achieved with master pages.
Then there are several more or less independent applications (old and new ones!) which should be easily and independently integrated into this central application (which should be the entry point of these applications).
Which approaches, architectures, patterns and techniques can help to achieve these aims? For example, does it make sense to run the (sub)applications in an iframe?
Are there lightweight and easy-to-learn portal frameworks which can be used (not big things like DotNetNuke)?
Many thanks in advance for your hints, tips and help!
DON'T REINVENT THE WHEEL! The thing about DotNetNuke is that it can be as big or as small as you make it. If you use it properly, you will find that you can limit it to what you need. Don't put yourself through the same pain that others have already put themselves through. Unless of course you are only interested in learning from your pain.
I'm not saying that DNN is the right one for you. It may not be, but do spend the time to investigate a number of open source portals before you decide to write your own one. The features that you describe will take 1000s of hours to develop and test if you write them all from scratch.
Michael Shimmins makes some good suggestions about what to use to implement a portal app with some of the newer technology and best-practice patterns. I would say, yes, these are very good recommendations, but I would encourage you to either find someone who has already done it this way or start a new open source project on CodePlex and get others to help you.
Daniel Dyson makes a fine point, but if you really want to implement it yourself (there may be a reason), I would consider the following components:
MVC 2.0
Inversion of Control/Dependency Injection (StructureMap for instance)
Managed Extensibility Framework
NHibernate (either directly or through a library such as S#arp Architecture or Spring.NET)
A service bus (NServiceBus for instance).
This combination gives you a flexible user interface through MVC, which can easily be extended via plugins (exposed and consumed via MEF), a standard data access library (NHibernate) which can easily be configured by the individual plugins to connect to specific databases, and the ability to publish events and "pick them up" in components composed at runtime (NServiceBus).
Using IoC and DI you can pass around interfaces which are resolved at runtime based on your required configuration. MEF gives you the flexibility of defining "what" each plugin can do, and then leaving it up to the plugins to do so, whilst your central application controls cross-cutting concerns such as authentication, logging etc.
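The composition idea itself is not tied to MEF. As a language-neutral illustration, a host that discovers plugins at startup and keeps cross-cutting concerns to itself might look roughly like this sketch (the Plugin protocol and plugin names are made up):

```python
# Language-neutral sketch of runtime plugin composition: the host owns
# cross-cutting concerns (auth, logging) and delegates features to plugins
# composed at startup. The Plugin protocol and plugin names are made up.
from typing import List, Protocol


class Plugin(Protocol):
    name: str
    def render(self) -> str: ...


class ReportsPlugin:
    name = "reports"
    def render(self) -> str:
        return "<h1>Reports</h1>"


class BillingPlugin:
    name = "billing"
    def render(self) -> str:
        return "<h1>Billing</h1>"


class PortalHost:
    def __init__(self, plugins: List[Plugin]):
        self.plugins = {p.name: p for p in plugins}

    def handle(self, user: str, plugin_name: str) -> str:
        print(f"auth+log: {user} -> {plugin_name}")  # cross-cutting concerns stay in the host
        return self.plugins[plugin_name].render()


if __name__ == "__main__":
    host = PortalHost([ReportsPlugin(), BillingPlugin()])
    print(host.handle("alice", "reports"))
```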

Recommendations on how to decouple services (RSS, REST API) from my UI (webforms) when they share a common model?

I have a web application that is arranged into data, business and UI projects. As the system evolves, changes are deployed by building all three projects and deploying them as one package. This has worked well and has allowed the illusion of “three tiers” without tackling the communication and versioning issues of truly separate systems.
So along comes a request for XML summaries of some of the data, and my thoughts turn to a fancy WCF service that, one day, could be my “Web API” (ahh… the mind… what an evil little monkey it is). So, assuming this survives the “is that really the best idea?” test, here is my question:
What structure have you had the most success with when posed with two evolving “clients” serving content from a single evolving “model”?
James, your question is rather broad, as there are a large number of variables that go into choosing the right type of architecture for your needs. I would recommend reading the patterns & practices Application Architecture Guide 2.0 to better understand the options and pick the one that best suits your individual needs.
