Integration testing with XCTest

I'm trying to write some unit and integration tests using XCTest.
There is one part I'm not sure whether to treat as a unit test or an integration test:
I need to test uploading a document using an upload service.
My question: since my function interacts with a different service (the upload service), I'm thinking it is an integration test rather than a unit test.
Could anyone please confirm?
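For what it's worth, the usual dividing line is whether the real upload service is involved: going through the real service makes it an integration test, while putting the service behind an interface and substituting a test double keeps it a unit test. Below is a minimal sketch of that boundary; it's written as Java-style xUnit code rather than XCTest, since the principle is framework-agnostic, and every name in it is hypothetical.

// All types here are hypothetical, purely to illustrate the unit/integration boundary.
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Test;

interface UploadService {
    // The seam: production code injects the real client, tests inject a double.
    boolean upload(byte[] document);
}

class DocumentUploader {
    private final UploadService service;

    DocumentUploader(UploadService service) {
        this.service = service;
    }

    boolean uploadDocument(byte[] document) {
        return document.length > 0 && service.upload(document);
    }
}

class DocumentUploaderTest {
    @Test
    void uploadsNonEmptyDocument() {
        UploadService fake = document -> true; // stub: no network involved, so this stays a unit test
        assertTrue(new DocumentUploader(fake).uploadDocument(new byte[] {1}));
    }
}

An integration test of the same code would construct DocumentUploader with the real client and assert against the service's observable effects.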

Related

Test Automation Setup in CI/CD Pipelines

My organization is looking to implement CI/CD in our deployment process, and I've been tasked with setting up the test automation part of the pipeline. I did a lot of reading and have an idea of how the setup should look. Could someone go through my idea, advise whether it's indeed the correct setup, and suggest improvements if needed?
Here's how I have pictured the setup in my mind. My organization has three environments: Development, Testing and Production. The test automation scripts in the pipeline will be executed in the Testing environment.
Developers check code into the Development environment.
This then triggers the deployment to the Testing environment.
Once the deployment is complete, it triggers the test automation scripts in the repo via their test suite IDs.
Each pipeline points to a specific test suite ID.
The automation scripts perform regression testing to ensure the new changes don't break existing functionality.
At the same time, testers do manual testing in the Testing environment to ensure the new changes work as intended.
If both the automated and the manual tests pass, we can proceed to the next stage (UAT approval and then PROD deployment).
Would this be the correct process flow?
Many thanks in advance!

How to implement Jira Xray + Robot Framework?

Hello, I'm a new junior software tester and I've been asked to study Xray and Robot Framework and how to implement both.
I made a few test cases in Xray, then started learning Robot Framework, and up to that point all was good.
Now I've been trying to report the results of the test cases I implemented in Robot against a test execution in Xray, but every time I import the output.xml from Robot into Xray, instead of synchronizing the tests, Xray creates new tests with the Robot results.
Is there anyone around who has done this before and could help me? I've tried adding tags in Robot and even using the same test names (in Xray and Robot), but it didn't work. Thanks in advance.
I recommend using Jenkins with the Xray Jira plugin to sync the results of automated tests into Xray Test items.
You use a tag in Robot to link a test case to an Xray Test item; if you don't specify an ID, the plugin creates a new Test item and keeps it updated based on the name. For example:
*** Test Cases ***
Add Multiple Records To Timesheet By Multi Add Generator
    [Tags]    PD-61083
    No Operation    # real test steps go here; the tag is the Xray Test issue key
Check this link for details on how to configure the integration
https://docs.getxray.app/display/XRAY/Integration+with+Jenkins
The plugin can keep track of the execution in a specific Test Execution item or create one per run, but it should keep referring to the same Test item.
When you upload the RF results, Xray will auto-provision Test issues, one per Robot Framework test case. This is the typical behavior, which you may override if you want to report results against an existing Test issue. In that case, you would have a Test in Jira and then add a tag to the RF test case entry with the issue key of the existing Test issue.
However, taking advantage of auto-provisioning of Tests is easier and is probably the most common use case. Xray will only provision/create Test issues if they don't exist; for this, Xray tries to figure out whether a generic Test exists with the same definition (i.e. the names of the RF test suites plus the test case name). If it finds one, it will just report results (i.e. create a Test Run) against the existing Test issue.
If Test issues are being created each time you submit test results, that's unexpected behavior and needs to be analyzed in more detail.
There is another entity to keep in mind: the Test Execution.
Your results will be part of a Test Execution. Every time you submit test results, a new Test Execution is created, unless you specify otherwise. In the REST API request (or in the Jenkins plugin) you may specify an existing Test Execution by its issue key. If you do so, the results will be overwritten on that Test Execution and no new Test Execution issue will be created. Think of it as reusing a given Test Execution.
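As a rough sketch of that REST flow, the Java snippet below submits a Robot Framework output.xml to an Xray server/DC import endpoint while reusing an existing Test Execution via the testExecKey parameter. The host, project key, Test Execution key and credentials are all placeholders, and you should double-check the endpoint against the documentation for your Xray version.

import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;
import java.util.List;

public class XrayImport {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Placeholders: adjust host, project key, Test Execution key and credentials.
        String url = "https://jira.example.com/rest/raven/1.0/import/execution/robot"
                + "?projectKey=PD&testExecKey=PD-100"; // testExecKey reuses an existing Test Execution
        String boundary = "----robot-results";
        String auth = Base64.getEncoder().encodeToString("user:password".getBytes());

        // Build a minimal multipart/form-data body containing output.xml.
        byte[] head = ("--" + boundary + "\r\n"
                + "Content-Disposition: form-data; name=\"file\"; filename=\"output.xml\"\r\n"
                + "Content-Type: application/xml\r\n\r\n").getBytes();
        byte[] file = Files.readAllBytes(Path.of("output.xml"));
        byte[] tail = ("\r\n--" + boundary + "--\r\n").getBytes();

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("Content-Type", "multipart/form-data; boundary=" + boundary)
                .header("Authorization", "Basic " + auth)
                .POST(HttpRequest.BodyPublishers.ofByteArrays(List.of(head, file, tail)))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}

Omitting the testExecKey parameter should give the default behavior described above, where each submission creates a fresh Test Execution.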
How the integration works and the available capabilities are described in some detail within the documentation.
As an additional reference, let me also share this RF tutorial as it may be useful to you.

E2E tests with Cypress, Nuxt and Firebase

I have a project using Nuxt for the front end and Firebase for the back end. I had written unit tests for it using Jest. But when writing E2E tests using Cypress, I cannot find a way to mock data when calling a Firebase function:
firebaseFunctions.httpsCallable(functionName)
I found the package cypress-firebase, but it doesn't cover Firebase functions.
Please help me with how to write E2E tests for a system using Firebase as the server side, or suggest the best way to do E2E testing, such as testing against a staging URL. Thank you so much.

How do I submit test results to Microsoft Test Management Server via HTTP?

My company uses Microsoft Test Management server to host its tests and results. For manual tests this works fine: a QA engineer runs the test and marks its status. I have been tasked with writing some automated tests, and I need them to submit results to the server. I know there is a code API, but I want to do this from a non-.NET test environment (I am going to use AutoIt), so I would like to submit results via an HTTP API. How can I do this? Where can I find some good examples? Or is there a better way? We are very much an MS TFS shop, so whatever I do needs to fit into that environment.
Thank you!
Microsoft has a UI automation framework and API called Coded UI that is fully integrated with Microsoft Test Manager.
You can:
Generate test automations from Action recordings of manual tests
Generate test automations in Visual Studio
Code test automations from scratch in VS
These automations can then be associated with a Test Case in MTM and pushed to test environments manually or via an API. I usually have new code built, deployed, and tested automatically using these techniques.
You can also plug other UI frameworks into this model.
If you want to submit test results from a non-Windows environment, then you should use the cross-platform API. As part of Team Explorer Everywhere you get both a command-line client and a Java-based object model for manipulating TFS.
http://www.microsoft.com/en-us/download/details.aspx?id=47727
I should note that the API for constructing test result submissions is quite complex, as your tests in MTM are part of a hierarchy of Suites and Plans, and each Test Case can exist in more than one location. You will need to create a Test Run and populate it with the appropriate data.

How to use Sonar+JaCoCo to measure line coverage using integration tests (manual+automated)

I am trying to do line coverage analysis of a Java-based application. I found many resources on the internet about using the Sonar+JaCoCo plugin to get line coverage results, and it looks very promising. However, I couldn't get full clarity on how to implement this solution.
More about my project:
There is a service called by a website. The service is Java-based and is built using Maven.
There is also a Selenium-based test suite run against the website (which calls the above-mentioned service in several places). The test suite is built and invoked by Ant.
The code base for the service and the code base for the tests are at different locations on the same host.
I need to generate a coverage report for the service based on the integration test suite.
The resources I went through are:
http://www.sonarsource.org/measure-coverage-by-integration-tests-with-sonar-updated/
http://www.eclemma.org/jacoco/trunk/doc/ant.html
Even after going through all of these, I am not sure where to put jacoco-agent.jar, whether to make JaCoCo part of Maven (the service's build process) or of Ant (the tests' build process), how to invoke the JaCoCo agent, or where to specify the source repository (the service's code base) and the test repository locations.
I have tried blind permutations of all of the above, but either the Maven build or the Ant build starts failing as soon as I add the JaCoCo tasks.
Can someone please help me out with this? I need to understand the exact steps to follow to get it done.
When you start your server process for the test run, you need to ensure the JaCoCo agent is attached to the server JVM (via the -javaagent option). The agent will then listen and record details of the code covered for the lifetime of the JVM.
You then execute your client-side Selenium tests, which will invoke the server. The JaCoCo agent in this case records details of the code executed as part of your tests. When the client tests finish, you shut down your server process, which should produce a JaCoCo coverage file.
The final step is to generate a JaCoCo HTML report based on your coverage file. I might suggest you look into moving your Ant-based Selenium tests into your Maven POM, since it will then be easier to control the order of test execution.
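To make the moving parts concrete: the agent is attached with a JVM option along the lines of -javaagent:/path/to/jacocoagent.jar=destfile=jacoco.exec (the paths are placeholders), and once jacoco.exec exists you can turn it into an HTML report. Below is a minimal sketch using JaCoCo's report API, modeled on the ReportGenerator example in the JaCoCo documentation; the directory names are assumptions for this project layout.

import java.io.File;
import java.io.IOException;

import org.jacoco.core.analysis.Analyzer;
import org.jacoco.core.analysis.CoverageBuilder;
import org.jacoco.core.analysis.IBundleCoverage;
import org.jacoco.core.tools.ExecFileLoader;
import org.jacoco.report.DirectorySourceFileLocator;
import org.jacoco.report.FileMultiReportOutput;
import org.jacoco.report.IReportVisitor;
import org.jacoco.report.html.HTMLFormatter;

public class CoverageReport {
    public static void main(String[] args) throws IOException {
        // Placeholder paths: the exec file written at server shutdown,
        // the service's compiled classes, its sources, and the report target.
        File execFile = new File("jacoco.exec");
        File classesDir = new File("service/target/classes");
        File sourceDir = new File("service/src/main/java");
        File reportDir = new File("coverage-report");

        // Load the execution data recorded by the agent.
        ExecFileLoader loader = new ExecFileLoader();
        loader.load(execFile);

        // Analyze the class files against the recorded execution data.
        CoverageBuilder coverageBuilder = new CoverageBuilder();
        Analyzer analyzer = new Analyzer(loader.getExecutionDataStore(), coverageBuilder);
        analyzer.analyzeAll(classesDir);
        IBundleCoverage bundle = coverageBuilder.getBundle("Integration test coverage");

        // Render an HTML report, linking coverage back to the sources.
        HTMLFormatter formatter = new HTMLFormatter();
        IReportVisitor visitor = formatter.createVisitor(new FileMultiReportOutput(reportDir));
        visitor.visitInfo(loader.getSessionInfoStore().getInfos(),
                loader.getExecutionDataStore().getContents());
        visitor.visitBundle(bundle,
                new DirectorySourceFileLocator(sourceDir, "utf-8", 4));
        visitor.visitEnd();
    }
}

Depending on your Sonar version, Sonar can also pick up the coverage file itself through its JaCoCo report path properties, in which case the manual report step above is optional.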
