I'm trying to create a test report with screenshots and videos through Cypress. The test reports are shown, but the screenshots and videos are not. Does anyone know how I can do that?
I used "cypress-mochawesome-reporter":
https://www.npmjs.com/package/cypress-mochawesome-reporter
Thanks a lot.
You can attach screenshots to the report. There is even an option to attach a screenshot of every retry if you have a retry policy in your Cypress tests. To do that, just follow the steps in the documentation. Pay attention to two points:
If your tests override the 'before:run' or 'after:run' events, you should add the separate piece of code from the documentation.
Don't forget to add the import to cypress/support/e2e.js. Without it you will see no errors, but no screenshots will be attached to the report:
import 'cypress-mochawesome-reporter/register';
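For Cypress 10+, the plugin also needs to be registered in setupNodeEvents. This is a minimal sketch that mirrors the plugin's documented setup; treat the reporterOptions values (embeddedScreenshots, inlineAssets) as settings to double-check against the plugin's README:

```javascript
// cypress.config.js: a minimal sketch of the plugin wiring; verify the
// reporterOptions names against the cypress-mochawesome-reporter docs.
const { defineConfig } = require('cypress');

module.exports = defineConfig({
  reporter: 'cypress-mochawesome-reporter',
  reporterOptions: {
    embeddedScreenshots: true, // inline failure screenshots into the HTML report
    inlineAssets: true,        // bundle CSS/JS so the report is a single file
  },
  video: true,
  e2e: {
    setupNodeEvents(on, config) {
      // registers the before:run/after:run hooks the reporter relies on
      require('cypress-mochawesome-reporter/plugin')(on);
    },
  },
});
```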
There is no way to attach a video for now. Mocha itself has only a Test context, but Cypress records video as one file per suite, not per test. That is why there is an open request asking Mocha to provide a Suite context so that Cypress videos can be attached:
https://github.com/adamgruber/mochawesome-report-generator/issues/150
Take a look at this example in cypress.json:
{
"reporter": "cypress-mochawesome-reporter",
"video": true,
"screenshotsFolder": "images"
}
I want to have an option on the cucumber report to mute/hide scenarios with a given tag from the results and numbers.
We have a Bamboo build that runs our Karate repository of features and scenarios. At the end it produces nice Cucumber HTML reports. On "overview-features.html" I would like an option added to the top right, next to "Features", "Tags", "Steps" and "Failures", that says "Excluded Fails" or something like that. When clicked, it would show exactly the same information that overview-features.html does, except that any scenario tagged with a special tag, for example #bug=abc-12345, is removed from the report and excluded from the numbers.
Why I need this: we have some existing scenarios that fail. They fail due to defects in our own software that might not get fixed for six months to a year. We've tagged them with a special tag, "#bug=abc-12345". I want them muted/excluded from the Cucumber report produced at the end of the Bamboo build, so I can quickly look at the number of passed features/scenarios and see whether it's 100%. If it is, great, that build is good. If not, I need to look into it further, as we appear to have some regression. With these expected failures in the report, it is very tedious and time-consuming to go through all the individual feature-file reports, look at the failing scenarios, and then investigate why each failed. I don't want them removed completely, because when they start to pass I need to know, so I can go back and remove the tag from the scenario.
Any ideas on how to accomplish this?
Karate 1.0 has overhauled the reporting system with the following key changes.
after the Runner completes you can massage the results and even re-try some tests
you can inject a custom HTML report renderer
This will require you to get into the details (some of this is not documented yet) and write some Java code. If that is not an option, you have to consider that what you are asking for is not supported by Karate.
If you are willing to go down that path, here are the links you need to get started.
a) An example of how to "post process" result data before rendering a report: RetryTest.java; also see https://stackoverflow.com/a/67971681/143475
b) The code responsible for "pluggable" reports, where in theory you can implement a new SuiteReports. In the Runner, there is a suiteReports() method you can call to provide your implementation.
Also note that there is an experimental "doc" keyword, with which you can inject custom HTML into a test report: https://twitter.com/getkarate/status/1338892932691070976
Also see: https://twitter.com/KarateDSL/status/1427638609578967047
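The "post process" idea in (a) can be sketched with plain Java collections. This is a hypothetical illustration of filtering muted tags out of the failure count; Karate's actual Results/ScenarioResult API differs, so treat the names below as stand-ins, not Karate classes:

```java
// Self-contained sketch: count failures while excluding scenarios that
// carry a "mute" tag such as bug=abc-12345. The ScenarioResult record is
// a hypothetical stand-in for Karate's real result objects.
import java.util.List;
import java.util.Set;

public class MutedTags {

    // hypothetical stand-in for one scenario's outcome
    public record ScenarioResult(String name, boolean failed, Set<String> tags) {}

    // count failures, excluding any scenario carrying the mute tag
    public static long realFailures(List<ScenarioResult> results, String muteTag) {
        return results.stream()
                .filter(ScenarioResult::failed)
                .filter(r -> !r.tags().contains(muteTag))
                .count();
    }

    public static void main(String[] args) {
        List<ScenarioResult> results = List.of(
                new ScenarioResult("login works", false, Set.of()),
                new ScenarioResult("known defect", true, Set.of("bug=abc-12345")),
                new ScenarioResult("new regression", true, Set.of()));
        // only "new regression" counts; the tagged failure is muted
        System.out.println(realFailures(results, "bug=abc-12345")); // prints 1
    }
}
```

If the muted count is zero, the build is green even though the known defects still fail; when a tagged scenario starts passing, it simply stops appearing in either count, which is your cue to remove the tag.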
Google very kindly offers a sample dataset of Google Analytics data in BigQuery, so you can run some tests before setting up the actual export to BigQuery.
This is detailed at the following page https://support.google.com/analytics/answer/3416091?hl=en&ref_topic=3416089
However, when you try to add the project google.com:analytics-bigquery as shown in step 5 of the guide, the following error is shown:
Project IDs may contain letters, numbers, and dashes, with an optional
"domain:" prefix.
As you can see, the project ID conforms to this, but you are unable to submit the form.
Google, is this just a bug?
Does anyone else have the same issue?
I can confirm: I just tried and got the same error.
Sometime recently this dialog was changed (options were added to choose between displaying the project in the nav and making it the current project), and it looks like the bug was introduced then; previous versions didn't have it.
At the same time, I can confirm that nothing is wrong with the project name itself: using an internal tool, I was able to add this project successfully.
You should report this bug in the BigQuery issue tracker.
Thanks for reporting this issue. We actually already have a fix checked in for this, but we are currently waiting for an opportunity to push this to production, hopefully soon.
In the meantime, you can try one of these workarounds:
navigate directly to the project: https://bigquery.cloud.google.com/queries/<project_id>
navigate to a dataset on the project, which will display it in the left nav: https://bigquery.cloud.google.com/dataset/<project_id>:<dataset_id>
I am new to Robot Framework and want to know how to capture screenshots on failure.
Doesn't Robot Framework automatically take screenshots if a script fails?
An example would be of great help!
This is actually a feature of Selenium2Library, the library you would use with Robot Framework for Selenium-based tests.
More information can be found here: http://robotframework.org/Selenium2Library/doc/Selenium2Library.html
As the documentation says, setting up screenshots on failure is very easy. For example, here is mine from a test suite I'm working with:
Library Selenium2Library timeout=10 implicit_wait=1.5 run_on_failure=Capture Page Screenshot
You can use the keyword below to capture a screenshot after any step you want:
Capture Page Screenshot
Hope this was helpful!
In this case the teardown will be executed whether or not the test case ran, and if the test case failed it will take a screenshot:
[Teardown] Run Keyword If Test Failed Capture Page Screenshot
Or you can do it even better at suite level, if you don't need different teardowns for particular tests:
[Test Teardown] Run Keyword If Test Failed Capture Page Screenshot
So far, all of the other answers assume that you are using Selenium.
If you are not, there is a "Screenshot" library that has the keyword "Take Screenshot". To include this library, all you need to do is put "Library    Screenshot" in your settings table.
In my robotframework code, my teardowns all just reference a keyword I made called "Default Teardown" which is defined as:
Default Teardown
Run Keyword If Test Failed Take Screenshot
Close All
(I think that the Close All might be one of my custom keywords).
I have noticed a few issues with the Take Screenshot keyword. Some of this may be configurable, but I don't know. First off, it takes a screenshot of your whole screen, not just the application you are interested in. So if you're using this and allowing other people to view the resulting screenshots, make sure you don't have anything else on your screen that you wouldn't want to share.
Also, if you kick off your tests and then lock your screen so you can take a quick break while it runs, all of your screenshots will just be pictures of your lock screen.
I'm using this on my Jenkins server as well, which uses the xvfb-run command to create a sort of fake GUI for running the Robot Framework tests. If you're doing this too, make sure your xvfb-run command includes something along the lines of
xvfb-run --server-args="-screen 0 1024x768x24" <rest of your command>
You'll have to decide what screen resolution works the best for you, but I found that with the default screen resolution, only a small portion of my app was captured.
Long story short, I think you're better off using Capture Page Screenshot if you're using Selenium. If you're not, however, this may be your best (or only) solution.
I've got some GitHub projects which I want to test with code coverage. The only way I've found (see blog post) to achieve this is to write a custom script that counts lines in the coverage XML and outputs something like "Code coverage is 74.32%, which is below the accepted 80%". Displaying code coverage as HTML would be much better, but is that possible on Travis CI?
You can use https://coveralls.io/ together with Travis to display coverage nicely. Example can be found here: https://coveralls.io/r/phpmyadmin/error-reporting-server
PS: I know this is quite an old question, but I found it just now while searching for something else.
Travis CI doesn't support any persistent storage. One suggestion would be to create a custom script that runs phpunit --coverage-html and then sends the contents of the output directory to your own server using something like rsync.
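Putting the question's threshold script and the rsync suggestion together, a sketch might look like the following. The Clover XML attribute names, paths, and server are assumptions to adapt to your project:

```shell
#!/bin/sh
# Sketch: read the statement-coverage percentage from a Clover-style XML
# report, fail the build below a threshold, then (on success) upload the
# HTML report. Attribute names, paths, and server are assumptions.
set -e

# Print coverage as a percentage, e.g. "75.00"
coverage_percent() {
  covered=$(grep -o 'coveredstatements="[0-9]*"' "$1" | head -n1 | tr -dc '0-9')
  total=$(grep -o ' statements="[0-9]*"' "$1" | head -n1 | tr -dc '0-9')
  awk -v c="$covered" -v t="$total" 'BEGIN { printf "%.2f\n", 100 * c / t }'
}

# Echo the message from the question and exit non-zero below 80%
check_threshold() {
  pct=$(coverage_percent "$1")
  echo "Code coverage is ${pct}%"
  awk -v p="$pct" 'BEGIN { exit (p < 80) }'
}

# Example CI usage (commented out; needs your own phpunit setup and server):
# phpunit --coverage-clover build/clover.xml --coverage-html build/coverage
# check_threshold build/clover.xml
# rsync -az build/coverage/ deploy@example.com:/var/www/coverage/
```

The non-zero exit from check_threshold is what makes Travis mark the build as failed when coverage drops below the accepted percentage.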