How can one set up QUnit to run integration tests against a remote site? I've used the ember-cli test runner, which has QUnit + Testem built in, to run integration tests locally in an ember-cli project. It works by loading the entire Ember app in the QUnit container and then running the tests, which can be unit or integration tests. This approach also suits me because I can't have a JRE dependency.
I think it may be possible to load fooDomain.com/somepath in an iframe and then run the tests against it.
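The iframe idea can be sketched roughly like this (an untested assumption on my part, not an established workflow; the URL is the placeholder path from above, and the same-origin policy will block DOM access unless the test page shares the origin with the site or is served through a proxy):

```javascript
// A QUnit test, meant to run in a browser test page, that loads the
// remote path in an iframe and asserts against it once it has loaded.
QUnit.test('remote page loads', function (assert) {
  var done = assert.async();
  var frame = document.createElement('iframe');
  frame.src = 'https://fooDomain.com/somepath'; // placeholder URL
  frame.addEventListener('load', function () {
    // frame.contentDocument is only readable when same-origin;
    // a reverse proxy in front of the remote site is one workaround.
    assert.ok(frame.contentWindow, 'iframe finished loading');
    done();
  });
  document.body.appendChild(frame);
});
```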
Has anyone a workflow for this?
I'm trying to run my meteor app's tests on Codeship CI, but without luck. I don't have any UI tests, I'm just testing the functionality of some Meteor methods. All the tests run locally successfully, but I can't find a proper way to kick off tests on CI.
Does anyone know any good links or setups for how I can achieve this?
If needed, I can change CI.
I want to get the coverage report for my Protractor E2E UI tests against a running node code.
I have tried the following steps:
1) Using Istanbul, I instrumented the code on one of my app servers (managed through Nginx):
istanbul instrument . --complete-copy -o instrumented
2) Stopped the actual node code and started the instrumented code on the same port (3000), without changing the Nginx config, so that any traffic hitting that app server is directed to the instrumented code running on the same server.
3) Ran the Protractor end-to-end tests, which live on another machine: I run the tests from a local machine, while the instrumented app is on another server.
4) At the end of the run, I stopped the instrumented code.
Now:
- There is no coverage variable available
- There is no Coverage Folder
- No report generated
I thought the coverage report would be generated if the instrumented code was hit through the protractor script.
I also googled around and found a plugin, "protractor-istanbul-plugin", but I'm not sure whether it's what I should use.
My questions:
Is it even possible to generate a coverage report if the instrumented code is on a different server and the Protractor script runs from a different machine?
If it is possible, is my assumption wrong that a report would be generated once the instrumented code is hit?
Should I use the istanbul cover command here, and if yes, how?
My goal is to instrument the code after deploying to the QA environment, then trigger the Protractor script, which sits on another machine, against the QA environment running the instrumented code.
Thanks in Advance.
I am writing e2e tests for a JS application at the moment. Since I am not a JS developer, I researched the topic for a while and ended up with the following setup:
Jasmine2 as testing framework
grunt as "build-tool"
protractor as test runner
jenkins as CI server (already in use for plenty java projects)
Although the application under test is not written in Angular, I decided to go with Protractor, following a nice guide on how to make Protractor run nicely even without Angular.
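Since Protractor by default waits for Angular on every page, the usual tweak for a non-Angular application is to switch that synchronization off in conf.js (a sketch; in Protractor versions of that era the flag is ignoreSynchronization):

```javascript
// conf.js (fragment) — tell protractor not to wait for Angular,
// since the application under test is not an Angular app.
onPrepare: function() {
  browser.ignoreSynchronization = true;
}
```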
Writing some simple tests and running them locally worked like a charm. In order to implicitly wait for some elements to show up in the DOM, I used the following code in my conf.js:
onPrepare: function() {
  browser.driver.manage().timeouts().implicitlyWait(5000);
}
All my tests were running as expected and so I decided to go to the next step, i.e. installation in the CI server.
The development team of the application I want to test was already using grunt to build the application, so I decided to just hook into that. The goal of my new grunt task is to:
assemble the application
start a local webserver running the application
run my protractor test
write some test reports
Finally I accomplished all of the above steps, but I am now facing a problem that I cannot solve and that googling did not help with. In order to run the Protractor tests from grunt, I installed the grunt-protractor-runner.
The tests are running, BUT the implicit wait is not working, causing some tests to fail. When I added some explicit waits (browser.sleep(...)), everything was OK again, but that is not what I want.
Is there any chance of getting implicit waits to work when using the grunt-protractor-runner?
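For reference, one workaround is an explicit-wait helper built on Protractor's ExpectedConditions, which waits per element instead of relying on the driver-level implicit wait (a sketch; the helper name is made up):

```javascript
// conf.js (fragment) — register a small explicit-wait helper that
// waits until a given element is visible, up to a timeout.
onPrepare: function() {
  global.waitForVisible = function(el, timeoutMs) {
    var EC = protractor.ExpectedConditions;
    return browser.wait(EC.visibilityOf(el), timeoutMs || 5000);
  };
}
```

A spec can then call waitForVisible(element(by.id('foo'))) before interacting with the element.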
UPDATE:
The problem has nothing to do with the grunt-protractor-runner. When I start a different webserver during my task, everything works again. To be more precise: with the plugin "grunt-contrib-connect" the tests pass, while with the plugin "grunt-php" they fail. So I am now looking for another PHP server for grunt. I will keep updating this question.
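For comparison, the grunt-contrib-connect target that works for me looks roughly like this (a sketch; the port and base directory are placeholders):

```javascript
// Gruntfile.js (fragment) — serve the assembled app for the tests.
connect: {
  test: {
    options: {
      port: 9000,       // the port the protractor config points at
      base: 'dist',     // directory holding the assembled application
      keepalive: false  // let the grunt task chain continue
    }
  }
}
```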
UPDATE 2:
While looking for alternatives, I considered and finally decided to mock the PHP part of the app.
I'm trying to set up my integration flow, and I have some quite destructive tests using the velocity-cucumber package.
The first issue I found is that these tests run against the standard Meteor db, which is fine on localhost and in dev, but not so great for production. As far as I can tell, velocity-cucumber doesn't do anything with mirrors yet.
Because of this I have two cases where I need Meteor to launch in a specific way.
1) On the CI server I need for JUST the tests to run then exit (hopefully with the correct exit code).
2) On the production server I need Meteor to skip all tests and just launch.
Is this currently possible with Meteor command line arguments? I'm contemplating making demeteorize a part of the process, and then use standard node.js testing frameworks.
To run velocity tests and then exit, you can allegedly run meteor with the --test option:
meteor run --test
This isn't working for me, but that's what the documentation says it is supposed to do.
To disable velocity tests, run meteor with the environment variable VELOCITY set to 0. This will skip setting up the mirror, remove the red/green dot, etc.:
VELOCITY=0 meteor run
I can invoke mvn cobertura:cobertura to instrument, run unit tests (using surefire), and generate reports.
I can invoke mvn verify to run unit tests and integration tests (using the failsafe Maven plugin).
But how do I call Maven to instrument, run unit and integration tests, and generate reports? The answer to "Running integration tests with Cobertura Maven plugin" did not work for me, and I would also rather not call verify with every Cobertura run, but only for the nightly coverage build.
You can try JaCoCo instead: it instruments on the fly and offers more flexible configuration for both gathering coverage and reporting.
Not sure if I fully understand the question, but I always do a mvn site ...