Robot Framework Variable Class File with triple Nested Dictionary is not dot notation accessible - robotframework

Using the Robot Framework documentation on variable files as a guide, I implemented a variable file class with the get_variables method. The basic example works as described.
When I implement a triple-nested dictionary (${A.B.C}), I can access the first two levels using the ${A} and ${A.B} notation. However, when I try to access the third level, ${A.B.C}, I get an error:
Resolving variable '${A.B.C}' failed: AttributeError: 'OrderedDict' object
has no attribute 'C'
In the three examples below, the first is an RF-generated nested dictionary whose nodes are all accessible through dot notation. The second is a plain Python dictionary returned by the variable class. In the last example, the variable class returns an OrderedDict.
Although ${A['B']['C']['key']} works, it is harder to read in the code. When I load a similar YAML structure, dot notation is fully supported, but that is not an option: the YAML file is static, and I need the flexibility of Python for some of the key values.
So I'm looking for support on how to return a data structure that allows Robot Framework to interpret the full nested structure with dot notation.
Variable Class File
from collections import OrderedDict

class OrderDict(object):

    def get_variables(self):
        C = OrderedDict([(u'key', u'value')])
        B = OrderedDict([(u'C', C)])
        A = OrderedDict([(u'B', B)])
        D = {
            u'E': {
                u'F': {
                    u'key': u'value'
                }
            }
        }
        return OrderedDict([(u'DICT__A', A), (u'DICT__D', D)])
Robot Framework Script
*** Test Cases ***
Dictionary RF
    ${Z}    Create Dictionary    key=value
    ${Y}    Create Dictionary    Z=${Z}
    ${X}    Create Dictionary    Y=${Y}
    Log To Console    ${EMPTY}
    Log To Console    ${X}
    Log To Console    ${X['Y']['Z']['key']}
    Log To Console    ${X.Y}
    Log To Console    ${X.Y.Z}
    Log To Console    ${X.Y.Z.key}

Plain Dictionary Variable Class
    Log To Console    ${EMPTY}
    Log To Console    ${D}
    Log To Console    ${D['E']['F']['key']}
    Log To Console    ${D.E}
    Log To Console    ${D.E.F}
    Log To Console    ${D.E.F.key}

Ordered Dictionary Variable Class
    Log To Console    ${EMPTY}
    Log To Console    ${A}
    Log To Console    ${A['B']['C']['key']}
    Log To Console    ${A.B}
    Log To Console    ${A.B.C}
    Log To Console    ${A.B.C.key}
Robot Framework Console Log
Suite Executor: Robot Framework 3.0.2 (Python 2.7.9 on win32)
Dictionary RF
{u'Y': {u'Z': {u'key': u'value'}}}
value
{u'Z': {u'key': u'value'}}
{u'key': u'value'}
value
| PASS |
------------------------------------------------------------------------------
Plain Dictionary Variable Class
{u'E': {u'F': {u'key': u'value'}}}
value
{u'F': {u'key': u'value'}}
| FAIL |
Resolving variable '${D.E.F}' failed: AttributeError: 'dict' object has no attribute 'F'
------------------------------------------------------------------------------
Ordered Dictionary Variable Class
{u'B': OrderedDict([(u'C', OrderedDict([(u'key', u'value')]))])}
value
OrderedDict([(u'C', OrderedDict([(u'key', u'value')]))])
| FAIL |
Resolving variable '${A.B.C}' failed: AttributeError: 'OrderedDict' object has no
attribute 'C'

In the Robot Framework Slack channel, Pekka Klarck pointed out that Robot Framework internally uses the robot.utils.DotDict class. Having get_variables() return a DotDict structure resolved my issue, and I can now use dot notation. Below is the code for the variable class DotDic (stored as DotDic.py).
from robot.utils import DotDict

class DotDic(object):

    def get_variables(self):
        G = {
            u'H': {
                u'I': {
                    u'key': u'value'
                }
            }
        }
        return {u'G': self.dict_to_dotdict(G)}

    def dict_to_dotdict(self, dct):
        dd = DotDict({})
        for key, val in dct.items():
            if isinstance(val, dict):
                dd[key] = self.dict_to_dotdict(val)
            else:
                dd[key] = val
        return dd
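Dot notation works here because DotDict falls back to dictionary item lookup when attribute lookup fails. As a minimal illustration of the idea (a stand-in sketch, not Robot Framework's actual implementation), the same recursive conversion can also be extended to handle dicts nested inside lists, which dict_to_dotdict above does not cover:

```python
class AttrDict(dict):
    """Minimal stand-in for robot.utils.DotDict: attribute access
    falls back to dictionary item lookup."""
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

def to_attrdict(obj):
    # Recursively convert nested dicts, including dicts inside lists.
    if isinstance(obj, dict):
        return AttrDict((key, to_attrdict(val)) for key, val in obj.items())
    if isinstance(obj, list):
        return [to_attrdict(item) for item in obj]
    return obj

data = to_attrdict({'E': {'F': {'key': 'value'}}})
print(data.E.F.key)  # 'value', reached via dot notation
```

Item access (data['E']['F']['key']) keeps working unchanged, since AttrDict is still a plain dict underneath.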

Related

path not being detected by Nextflow

I'm new to nf-core/Nextflow and, needless to say, the documentation does not reflect what is actually implemented. I'm defining the basic pipeline below:
nextflow.enable.dsl=2

process RUNBLAST {
    input:
    val thr
    path query
    path db
    path output

    output:
    path output

    script:
    """
    blastn -query ${query} -db ${db} -out ${output} -num_threads ${thr}
    """
}

workflow {
    //println "I want to BLAST $params.query to $params.dbDir/$params.dbName using $params.threads CPUs and output it to $params.outdir"
    RUNBLAST(params.threads, params.query, params.dbDir, params.output)
}
Then i'm executing the pipeline with
nextflow run main.nf --query test2.fa --dbDir blast/blastDB
Then i get the following error:
N E X T F L O W ~ version 22.10.6
Launching `main.nf` [dreamy_hugle] DSL2 - revision: c388cf8f31
Error executing process > 'RUNBLAST'
Caused by:
Not a valid path value: 'test2.fa'
Tip: you can replicate the issue by changing to the process work dir and entering the command bash .command.run
I know test2.fa exists in the current directory:
(nfcore) MN:nf-core-basicblast jraygozagaray$ ls
CHANGELOG.md conf other.nf
CITATIONS.md docs pyproject.toml
CODE_OF_CONDUCT.md lib subworkflows
LICENSE main.nf test.fa
README.md modules test2.fa
assets modules.json work
bin nextflow.config workflows
blast nextflow_schema.json
I also tried file instead of path, but that is deprecated and raises other kinds of errors.
It would be helpful to know how to fix this so I can get started with building the pipeline.
Shouldn't Nextflow copy the file to the execution path?
Thanks
You get the above error because params.query is not actually a path value; it's probably just a simple String or GString. The solution is to instead supply a file object, for example:
workflow {
    query = file(params.query)
    BLAST( query, ... )
}
Note that a value channel is implicitly created by a process when it is invoked with a simple value, like the above file object. If you need to be able to BLAST multiple query files, you'll instead need a queue channel, which can be created using the fromPath factory method, for example:
params.query = "${baseDir}/data/*.fa"
params.db = "${baseDir}/blastdb/nt"
params.outdir = './results'

db_name = file(params.db).name
db_path = file(params.db).parent

process BLAST {
    publishDir(
        path: "${params.outdir}/blast",
        mode: 'copy',
    )

    input:
    tuple val(query_id), path(query)
    path db

    output:
    tuple val(query_id), path("${query_id}.out")

    script:
    """
    blastn \\
        -num_threads ${task.cpus} \\
        -query "${query}" \\
        -db "${db}/${db_name}" \\
        -out "${query_id}.out"
    """
}

workflow {
    Channel
        .fromPath( params.query )
        .map { file -> tuple(file.baseName, file) }
        .set { query_ch }

    BLAST( query_ch, db_path )
}
Note that the usual way to specify the number of threads/CPUs is the cpus directive, which can be configured using a process selector in your nextflow.config. For example:
process {
    withName: BLAST {
        cpus = 4
    }
}

Access dictionary objects inside a list using a loop throwing error: 'list' object has no attribute 'keys' or 'unicode' object has no attribute 'keys'

For Robot Framework, the test cases Code1 and Code2 below try to access dictionary objects. The problem is that when I use json.loads to convert my JSON string, it returns a list (whose keys are shown in single quotes ' instead of double quotes "), and when I don't use json.loads I get a Unicode error.
Library definitions and test cases:
*** Settings ***
Library    OperatingSystem
Library    Collections
Library    HttpLibrary.HTTP

*** Test Cases ***
Code1
    # get json file
    ${json_data}=    Get File    detail.json
    # get dictionaries under list
    ${valuelist}=    Get Json Value    ${json_data}    /alladdress/addresslist
    # display it
    Log To Console    ${valuelist}
    # loop over dictionaries under list
    : FOR    ${key}    IN    @{valuelist.keys()}
    \    ${value}=    Get From Dictionary    ${valuelist}    ${key}
    # getting AttributeError: 'unicode' object has no attribute 'keys'
    \    Log To Console    ${key},${value}

Code2
    # get json file
    ${json_data}=    Get File    detail.json
    # get dictionaries under list
    ${valuelist}=    Get Json Value    ${json_data}    /alladdress/addresslist
    # use below line to avoid unicode error
    ${obj_list}=    Evaluate    json.loads('''${valuelist}''')    json
    # display it
    Log To Console    ${obj_list}
    # loop over dictionaries under list
    : FOR    ${key}    IN    @{obj_list.keys()}
    \    ${value}=    Get From Dictionary    ${obj_list}    ${key}
    # getting AttributeError: 'list' object has no attribute 'keys'
    \    Log To Console    ${key},${value}
Here is the JSON file:
{
    "class": {
        "id": 0,
        "name": "David"
    },
    "alladdress": {
        "count": 3,
        "addresslist": [
            {
                "houseno": 1,
                "streetno": 5,
                "streetname": "tesla",
                "city": "ABC",
                "state": "AA",
                "country": "UK",
                "zip": 85555
            },
            {
                "houseno": 2,
                "streetno": 6,
                "streetname": "honda",
                "city": "PQR",
                "state": "BB",
                "country": "IN",
                "zip": 5252
            }
        ]
    }
}
The HttpLibrary library provides the Parse Json keyword, which is useful here: it converts the JSON string fetched with Get Json Value into a dictionary or list.
The value of this approach is that you don't have to 'walk' the dictionary to get to the node you're looking for.
*** Settings ***
Library    OperatingSystem
Library    HttpLibrary.HTTP

*** Test Cases ***
Fetch Address List
    ${json_data}=    Get File    details.json
    ${addressesJSONstring}    Get Json Value    ${json_data}    /alladdress/addresslist
    ${addresseslist}    Parse Json    ${addressesJSONstring}
    : FOR    ${addressDict}    IN    @{addresseslist}
    \    Log    ${addressDict['country']}
It appears that the Get Json Value keyword returns strings rather than objects. If you replace that call with code that uses Python's json module, you can then parse the data to find what you want.
For example, this will print out each address dictionary:
*** Test Cases ***
Code1
    # get json file
    ${json_data}=    Get File    detail.json
    # convert the json string to python data
    ${data}=    Evaluate    json.loads($json_data)    json
    ${alladdress}=    Get From Dictionary    ${data}    alladdress
    ${addresslist}=    Get From Dictionary    ${alladdress}    addresslist
    # loop over dictionaries under list
    Log To Console    addresses:
    : FOR    ${address}    IN    @{addresslist}
    \    Log To Console    ${address}
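The same traversal in plain Python shows what the Evaluate call produces: json.loads returns nested dicts and lists, and addresslist is a list, so calling keys() on it fails. (The JSON literal below is an abridged, hypothetical stand-in for detail.json.)

```python
import json

# Abridged stand-in for the detail.json file from the question.
json_data = """
{
  "alladdress": {
    "count": 2,
    "addresslist": [
      {"houseno": 1, "streetname": "tesla", "country": "UK"},
      {"houseno": 2, "streetname": "honda", "country": "IN"}
    ]
  }
}
"""

data = json.loads(json_data)                     # nested dicts and lists
addresslist = data["alladdress"]["addresslist"]  # a list, not a dict
countries = [address["country"] for address in addresslist]
print(countries)  # ['UK', 'IN']
```

Iterating the list directly, as both answers do, is the fix: each element is a dictionary you can index by key.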

How can I divide a file into sections and put them in a dictionary using robot framework

I have a file that goes like this:
Name: John
Class: II
Age: 8
Interest: Sports
Name: Emma
Class: III
Hobby: Dance
So I want to read this file and put the contents into a dictionary with Name as the key. The sections vary in their number of lines. How can I build this dictionary using Robot Framework keywords?
Is this what you need?
*** Settings ***
Library    OperatingSystem
Library    String
Library    Collections

*** Test Cases ***
Split File By Names
    ${my_dict}    Create Dictionary
    ${data}    Get File    <path_to_your_data>
    @{lines}    Split To Lines    ${data}
    Remove Values From List    ${lines}    ${EMPTY}
    : FOR    ${line}    IN    @{lines}
    \    ${key}    ${value}    Split String    ${line}    :
    \    ${name}    Set Variable If    '${key}' == 'Name'    ${value.strip()}    ${name}
    \    Run Keyword If    '${key}' == 'Name'    Set To Dictionary    ${my_dict}    ${name}=@{EMPTY}
    \    Run Keyword If    '${key}' != 'Name'    Append To List    ${my_dict.${name}}    ${line}
    Log    ${my_dict}
Anyway, the RF way to parse the file sucks. I would rather go for Python.
#!/usr/bin/python
# -*- coding: utf-8 -*-

class ParseFile:

    def __init__(self):
        self.my_dict = {}

    def parse_file_to_dict(self):
        with open('<path_to_your_data>') as f:
            lines = f.read().splitlines()
        for line in (l for l in lines if l != ""):
            key, value = line.split(":", 1)
            if key == "Name":
                name = value.strip()
                self.my_dict[name] = []
            else:
                self.my_dict[name].append(line)
        return self.my_dict
... and then just call it in RF.
*** Settings ***
Library    ParseFile.py

*** Test Cases ***
Do It In Python
    ${my_dict}    Parse File To Dict
    Log    ${my_dict}
Please note that both approaches are tightly bound to the data structure you provided, i.e. if "Name" is not on the first line of each section, they will not work and will need more data handling.
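To keep the parsing logic easy to exercise without a file on disk, the same algorithm can be factored into a function that takes lines directly (parse_lines is a name introduced here for illustration, not part of the answer above):

```python
def parse_lines(lines):
    """Group 'Key: value' lines into sections keyed by each 'Name' line."""
    result = {}
    name = None
    for line in (l for l in lines if l != ""):
        key, value = line.split(":", 1)
        if key == "Name":
            name = value.strip()   # start a new section keyed by the name
            result[name] = []
        else:
            result[name].append(line)
    return result

sample = ["Name: John", "Class: II", "Age: 8", "Interest: Sports", "",
          "Name: Emma", "Class: III", "Hobby: Dance"]
sections = parse_lines(sample)
print(sections["John"])  # ['Class: II', 'Age: 8', 'Interest: Sports']
```

The file-based class above can then be reduced to reading the file and delegating to this function.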

Using robot.api to regenerate output.xml with only passed tests

I want to write a script that, based on output.xml, generates an output_pass.xml without the failed tests. My solution doesn't work:
from robot.api import ExecutionResult, ResultVisitor

class ExecutionStatus(ResultVisitor):

    def visit_test(self, test):
        if test.status == 'FAIL':
            test = None

def rm_failed(inpath, outpath):
    result = ExecutionResult(inpath)
    result.visit(ExecutionStatus())
    result.save(outpath)

rm_failed("output.xml", "passed.xml")
The HTML report generated from passed.xml (rebot passed.xml) still contains all the tests.

Custom command result

When invoking a custom command, I noticed that only the logs are displayed. For example, if my custom command script contains a return statement such as return "great custom command", I can't find that value in the result, whether I use the Java API client or shell execution.
What can I do to retrieve that result at the end of an execution?
Thanks.
Command definition in service description file:
customCommands ([
    "getText" : "getText.groovy"
])
getText.groovy file content:
def text = "great custom command"
println "trying to get a text"
return text
Assuming that your service file contains the following:
customCommands ([
    "printA" : {
        println "111111"
        return "222222"
    },
    "printB" : "fileB.groovy"
])
And fileB.groovy contains the following code:
println "AAAAAA"
return "BBBBBB"
Then if you run the command invoke yourService printA, you will get this:
Invocation results:
1: OK from instance #1..., Result: 222222
invocation completed successfully.
And if you run the command invoke yourService printB, you will get this:
Invocation results:
1: OK from instance #1..., Result: AAAAAA
invocation completed successfully.
So if your custom command's implementation is a Groovy closure, its result is the closure's return value.
And if your custom command's implementation is an external Groovy file, its result is its last statement's output.
HTH,
Tamir.
