How to write a custom shell-executable VTS test using the binary test template - android-vts

I am trying to create custom test cases using the VTS binary test template, but the Android codelab pages do not describe how to incorporate shell-executable tests into the VTS framework using the binary test template. Is this even possible?
I have successfully created custom C/C++ tests using the same binary test template given as an example in the codelab.

I assume you have created:
an Android.bp with a cc_test type binary called MyVtsTestBinary (a sketch of what this might look like follows the Android.mk snippet below),
a corresponding AndroidTest.xml test configuration,
and an Android.mk test module configuration like so:
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := MyVtsTestName
include test/vts/tools/build/Android.host_config.mk
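For reference, a minimal sketch of what that Android.bp entry might look like (the source file name here is just a placeholder):
cc_test {
    name: "MyVtsTestBinary",
    srcs: ["my_vts_test.cpp"],  // hypothetical source file
}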
vts-tradefed will expect your test binary and all required libraries to be located in $ANDROID_HOST_OUT/vts/android-vts/testcases. Your binaries will be copied there if you add them to the target_native_modules in test/vts/tools/build/tasks/vts_package.mk.
target_native_modules := \
[...]\
MyVtsTestBinary
You can check whether your test is known to VTS with:
vts-tradefed list modules
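If the module is listed, you should then be able to run it with something along the lines of the codelab's invocation, for example:
vts-tradefed run commandAndExit vts -m MyVtsTestName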

Related

Access Metadata set from cmd line in RobotFramework

When setting metadata for top-level test suites with the --metadata command line option (as described here), I don't see any working way of accessing the metadata items (via the &{SUITE METADATA} automatic variable, as mentioned here) within the test suite.
Namely, when running
pybot --metadata prettyMetaName:someMetaValue ...
and then trying to get the key prettyMetaName in the test suite setup with &{SUITE METADATA}[prettyMetaName], I get this error:
Parent suite setup failed:
Dictionary variable '&{SUITE METADATA}' has no key 'prettyMetaName'.
More detailed part of the test:
*** Keywords ***
Custom Setup
    Log    &{SUITE METADATA}[prettyMetaName]    level=WARN

*** Settings ***
Suite Setup    Custom Setup
But if I try to get the metadata via the Python listener API, I get a valid result.
On the other hand, when the metadata is declared explicitly in the Settings section, everything works as expected.
I'm using Robot 3.0.4.
I think you need an __init__.robot file in the root folder of your robot project with the following content.
*** Settings ***
Suite Setup    Store Top Suite Metadata

*** Keywords ***
Store Top Suite Metadata
    Set Suite Variable    ${TOP SUITE METADATA}    ${SUITE METADATA}    children=True
Then you can use the ${TOP SUITE METADATA} variable everywhere else to get access to the metadata set by command line arguments.
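For instance, any other suite in the project could then read the command-line metadata roughly like this (the key name prettyMetaName is simply the one from the question):
*** Test Cases ***
Use Command Line Metadata
    Log    &{TOP SUITE METADATA}[prettyMetaName]    level=WARN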

Import a custom library from a path using Java in Robot Framework test cases

I have created a custom library consisting of Robot Framework keywords. In order to use these keywords I have to specify
Library    abc.xyz.<Class_name>
This, however, does not look clean. I want to just have
Library    <Class_name>
which seems to be the standard way. How do I get this to work?
You can add the path to your custom library to the PYTHONPATH environment variable
and then use it like:
*** Settings ***
Library    abc.java
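A rough sketch of how that could look on Windows, assuming a hypothetical C:\TA\libs folder containing your compiled keyword classes:
rem hypothetical folder; point this at wherever your library actually lives
set PYTHONPATH=C:\TA\libs;%PYTHONPATH%
jython -m robot.run tests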
For more options and information, you can refer to this answer as well:
Import custom library from a different path in Robot Framework
The name of the library in Robot Framework consists of two parts:
Library    <Package Path>.<Class_name>
In many cases this means something like:
Library    org.company.application.<Class_name>
This is often reflected in the Java code as:
package org.company.application;
public class SampleKeywordLibrary {
In order to use only the class name in Robot Framework, the class must not define a package path. Depending on your editor, this may require some changes to your project settings as well.
Java:
public class SampleKeywordLibrary {
    public static final String ROBOT_LIBRARY_SCOPE = "GLOBAL";

    public void MyCustomJavaKeyword() {
    }
}
Then the Robot File looks like:
*** Settings ***
Library    SampleKeywordLibrary

*** Test Cases ***
TC
    My Custom Java Keyword
When you have exported/compiled it to a Jar file, place it where you want to store it and start Robot Framework from within the Jython context similar to this:
C:\Python\jython2.7.0\bin\jython.exe
    -J-Dpython.path=C:\Python\jython2.7.0\Lib\site-packages
    -J-cp .;C:\TA\Workspace\StackOverflowJython\SampleKeywordLibrary.jar
    -m robot.run
    -s StackOverflowJython.Test
    C:\TA\Workspace\StackOverflowJython

Concourse CI and Build number

I'm moving from Jenkins to Concourse CI to run my Sauce Labs e2e tests. Sauce Labs groups together tests that have the same build number string:
{
  name: 'Chrome XS',
  browserName: 'chrome',
  tunnelIdentifier: process.env.TUNNEL_IDENTIFIER,
  build: process.env.JENKINS_BUILD_NUMBER,
  platform: 'Windows 10',
  shardTestFiles: true,
  maxInstances: 20,
}
How can I pass the build number to my script using an environment variable, as shown above? The Concourse GUI uses name #number. Is there any way to retrieve this? I tried printing all the environment variables in the Docker container, but it's not set by default.
Metadata like the build number/ID are intentionally not provided to tasks. See https://concourse-ci.org/implementing-resources.html#resource-metadata
This sounds like it could potentially be a use case for a Sauce Labs resource?
In Concourse, build metadata is only available for resources, not tasks.
An example of using build metadata with resources is to include it as part of build result notification emails. The following blog entry contains more information about it:
http://lmpsilva.typepad.com/cilounge/2016/10/how-to-insert-build-metadata-into-user-notifications-in-concourse.html
If you really want to use a build number for versioning, you could try to create your own Concourse resource that returns the version number; however, I would use your commit number instead. Another alternative would be to use the semver resource in Concourse: https://github.com/concourse/semver-resource
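For example, a pipeline using the semver resource could look roughly like this (the repository URI and task file are hypothetical, and the bump strategy is up to you):
resources:
- name: version
  type: semver
  source:
    driver: git
    uri: git@github.com:your-org/your-repo.git   # hypothetical repo that stores the version file
    branch: version
    file: version

jobs:
- name: e2e-tests
  plan:
  - get: version
    params: {bump: patch}   # fetch the current version, bumped for this build
  - task: run-e2e
    file: ci/tasks/e2e.yml  # hypothetical task; it receives the version as a file in its version/ input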

PHPUnit + SonarQube exported coverage does not match actual XML result

sonarqube: 5.1.2
sonar-runner: 2.2.1
php plugin: 2.6
PHPUnit 4.2.6
We're running PHPUnit over our application, but I'm not able to get the expected coverage percentage on SonarQube.
In my phpunit.xml we have filters defining the only folders we want to cover.
<whitelist addUncoveredFilesFromWhitelist="false">
    <directory>./site-library-php/src/main/php/BabelCentral/Model/Content</directory>
</whitelist>
In my sonar properties
sonar.modules=php-module
php-module.sonar.phpUnit.coverage.analyzeOnly=true
php-module.sonar.phpUnit.analyzeOnly=true
php-module.sonar.php.tests.reportPath=site-main-php/src/test/target/reports/phpunit.xml
php-module.sonar.php.coverage.reportPath=site-main-php/src/test/target/reports/phpunit.coverage.xml
Viewing the log on Jenkins, the tests seem to run fine and end with:
Tests: 1479, Assertions: 4165, Failures: 58, Incomplete: 14, Skipped: 55.
Generating code coverage report in Clover XML format ... done
Generating code coverage report in HTML format ... done
// further down
11:47:51.973 INFO - Analyzing PHPUnit test report: site-main-php/src/test/target/reports/phpunit.xml with org.sonar.plugins.php.phpunit.PhpUnitResultParser#35e7e715
11:47:52.604 INFO - Analyzing PHPUnit unit test coverage report: site-main-php/src/test/target/reports/phpunit.coverage.xml with PHPUnit Unit Test Coverage Result Parser
All seems to be working well. However, SonarQube reports a different result, covering the entire source folder. These are the numbers shown on the dashboard:
Unit Tests Coverage 1.8%
line coverage 1.8%
Unit test success 95.9%
Is there any way to have SonarQube respect PHPUnit's configured filter?
Note:
I'm able to achieve the desired coverage numbers if I explicitly set
php-module.sonar.sources
to the directories/files that I want. It's just that a comma-separated config is harder to manage than PHPUnit's XML config.
Defining the set of source files which should be analysed by SonarQube is crucial: it's on this set of files that coding rules are applied and that metrics are computed.
sonar.sources is therefore a mandatory property and is the main way to configure the set of source files.
Other properties may be used to refine the set of source files:
You may configure sonar.tests: test files are automatically excluded from sonar.sources.
You may have to exclude some other PHP files like dependencies or generated code with sonar.exclusions.
You should also be aware that several plugins may contribute to the global coverage of the project: if you installed the JavaScript plugin and if sonar.sources contains JavaScript files, they will also be taken into account when calculating the coverage metrics.
If you're only interested in adjusting coverage, you can exclude files from coverage metrics with sonar.coverage.exclusions.
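A sketch of how those properties could look for the module from the question, with illustrative paths and patterns that you would have to adapt to your own layout:
# illustrative values only
php-module.sonar.sources=site-library-php/src/main/php
php-module.sonar.tests=site-main-php/src/test
php-module.sonar.exclusions=**/vendor/**,**/generated/**
php-module.sonar.coverage.exclusions=**/Legacy/**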

Pass an environment variable into SBT to use in a Specs2 test?

What is the correct way to pass an environment variable into SBT so that it can be accessed using Specs2? (And how do I then retrieve the value in Specs2?) The environment variable will contain an API key to use for testing.
It needs to be an environment variable to work with Travis CI's encrypted environment variable functionality.[1]
My setup:
SBT 0.13.0
Specs2 2.3.4
Travis CI
Edit: bonus points if somebody can link to an open-source repo that does this. There must be a few!
[1] Using secret api keys on travis-ci
I guess that you can encrypt your key with the Travis API and get:
xxxEncryptedxxx
Then you can use the CommandLineArguments trait to pass arguments from the command line in SBT to your specification.
In .travis.yml
sbt ++$TRAVIS_SCALA_VERSION "testOnly *MySpec* -- key xxxEncryptedxxx"
In MySpec.scala
class MySpec extends mutable.Specification with CommandLineArguments {
  "this is an API test" >> {
    arguments.commandLine.value("key").map { k =>
      callApi(k) must beOk
    }.getOrElse(ko("you need to pass a key on the command line"))
  }
}
From your question, I presume you're looking to pass secure environment variables using Travis's built-in support for encryption?
If so, the environment variable is set before SBT is run, so it should be available to all child processes. I don't use Specs2, but the standard JVM way to get an environment variable is System.getenv(String). It's possible that SBT drops environment variables before running Specs2; if that's true, then fixing it has to be done in your build.sbt somehow, and isn't specific to Travis.
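A minimal sketch of that approach in a Specs2 test, assuming a hypothetical variable name API_KEY exported by Travis:
import org.specs2.mutable.Specification

// Hypothetical example: API_KEY is whatever name you gave the encrypted variable.
class ApiKeyEnvSpec extends Specification {

  "the API key environment variable" should {
    "be visible to the test" in {
      sys.env.get("API_KEY") must beSome
    }
  }
}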
