Postman: data retrieved from an input file cannot be logged in console

I am using Postman for Windows Version 6.5.2.
Whenever I use an input file with variables, I would like to see the variables used in the current test case printed to the console. For example, I have a data file with a list of user IDs. Then, at some point in my tests, I would like to send a simple message to the console: "INFO: logging with user id XXXX."
I have tried assigning the data to both environment and global variables. It doesn't seem to work in either the "Pre-request Script" section or the "Tests" section. In my case, if I have an initial value defined, then that value is printed every time (even though Postman takes a different value from the file for each iteration). If no value is defined (tested with both environment and global variables), then I get an empty string printed out.

Using the console.log(pm.iterationData.toObject()) statement in the Tests tab would log an object containing the data from the file used in the request.
My sample JSON data file:
[
  {
    "item": "1",
    "item2": "Value 1"
  },
  {
    "item": "2",
    "item2": "Value 2"
  },
  {
    "item": "3",
    "item2": "Value 3"
  },
  {
    "item": "4",
    "item2": "Value 4"
  }
]
When the Collection is run from the Collection Runner, this logs the current iteration's data object to the console.
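To print a single value per iteration, as the question asks, pm.iterationData.get() can be used the same way; the key names below are the ones from the sample data file above:
// Log one field from the current iteration's data row
console.log("INFO: current item is " + pm.iterationData.get("item") + " - " + pm.iterationData.get("item2"));
// Or dump the whole row for the current iteration
console.log(pm.iterationData.toObject());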

Related

Using the reference function in an ARM template parameter file

Is there any way to use the reference function in an ARM parameter file? I understand the following can be used to retrieve the instrumentation key of an App Insights instance, but this doesn't seem to work in a parameter file.
"[reference('microsoft.insights/components/web-app-name-01', '2015-05-01').InstrumentationKey]"
I currently set a long list of environment variables using an array from a parameter file and need to include the dynamic App Insights instrumentation key in that list of variables.
Unfortunately, no.
The reference function only works at runtime. It can't be used in the parameters or variables sections because both are resolved during the initial parsing phase of the template.
Here is an excerpt from the docs and also how to use reference correctly:
You can't use the reference function in the variables section of the template. The reference function derives its value from the resource's runtime state. However, variables are resolved during the initial parsing of the template. Construct values that need the reference function directly in the resources or outputs section of the template.
Not in a param file... it's possible to simulate what you want by nesting a deployment, if that's an option. So your param file can contain the resourceId of the Insights resource and then a nested deployment can make the reference call - but TBH, probably easier to fetch the key as a pipeline step (or similar).
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "insightsResourceId": {
      "type": "string",
      "defaultValue": "microsoft.insights/components/web-app-name-01"
    }
  },
  "resources": [
    {
      "apiVersion": "2018-02-01",
      "type": "Microsoft.Resources/deployments",
      "name": "nestedDeployment",
      "properties": {
        "mode": "Incremental",
        "parameters": {
          "instrumentationKey": {
            "value": "[reference(parameters('insightsResourceId'), '2015-05-01').InstrumentationKey]"
          }
        },
        "template": {
          // original template goes here; it must declare an 'instrumentationKey' parameter
        }
      }
    }
  ]
}
Way 1
Use the reference function in your parameter file for resources that are already deployed in another template. For that, you have to pass the ApiVersion parameter. Refer to the Microsoft docs; an example follows:
"value": "[reference(resourceId(variables('<AppInsightsResourceGroup>'),'Microsoft.Insights/components', variables('<ApplicationInsightsName>')), '2015-05-01', 'Full').properties.InstrumentationKey]"
You need to change the property that you are referencing from '.InstrumentationKey' to '.properties.InstrumentationKey'.
Refer to Kwill's answer for more information.
Way 2
Get the content of the parameter file into a PowerShell object using
$ParameterObject = Get-Content ./ParameterFileName.json -Raw | ConvertFrom-Json
Update the parameter file values using
# Assign the parameter values
$ParameterObject.parameters.<KeyName>.value = "your dynamic value"
Pass $ParameterObject to the -TemplateParameterObject parameter of New-AzResourceGroupDeployment, as sketched below.
Refer here
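A minimal end-to-end sketch of Way 2, assuming the Az PowerShell module; the resource group, file names, resource name and parameter key are placeholders:
# Parse the existing parameter file into an object
$paramFile = Get-Content ./ParameterFileName.json -Raw | ConvertFrom-Json

# Resolve the dynamic value at deployment time, e.g. the App Insights instrumentation key
$appInsights = Get-AzApplicationInsights -ResourceGroupName "my-rg" -Name "web-app-name-01"

# Flatten the parameter file into the name/value hashtable that -TemplateParameterObject expects
$templateParams = @{}
foreach ($p in $paramFile.parameters.PSObject.Properties) {
    $templateParams[$p.Name] = $p.Value.value
}
$templateParams["instrumentationKey"] = $appInsights.InstrumentationKey

New-AzResourceGroupDeployment -ResourceGroupName "my-rg" `
    -TemplateFile ./azuredeploy.json `
    -TemplateParameterObject $templateParams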
Way 3
Add or modify the parameter file values with a script (PowerShell, or another language such as Python or C#). After changing the parameter file, deploy it, as sketched below.
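A small Python sketch of that approach; the file name and the parameter key ("instrumentationKey") are assumptions:
# Patch the parameter file with a dynamic value before deploying
import json

with open("ParameterFileName.json") as f:
    params = json.load(f)

params["parameters"]["instrumentationKey"] = {"value": "your dynamic value"}

with open("ParameterFileName.json", "w") as f:
    json.dump(params, f, indent=2)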

How to force AWS Console Item Explorer to show all columns?

When I query one of my DynamoDB "tables" in the AWS web console's Item explorer, the resulting document does not show one of the "columns" that exists in the "record" (or one of the "properties" that exists in the "document", if you prefer document-store terminology).
How do I make it show all the columns?
For example, the following is the result of querying DynamoDB via the AWS CLI, but in the Item explorer of the AWS console (on the web) the thisThingsMissingInConsole "column" isn't shown (and neither is it available in the "Select visible columns" preference):
{
  "Items": [
    {
      "email": {
        "S": "my#e.mail"
      },
      "thisThingsMissingInConsole": {
        "SS": [
          "a",
          "b",
          "c"
        ]
      },
      ...
    },
    ...
  ]
}
In my case, when confirming via the CLI, I accidentally checked the record in a different environment than the one I was browsing in the web console 🤦‍♂️. So it turns out that the property that isn't shown is actually missing 🤬.
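For what it's worth, a quick way to double-check an attribute outside the console is a direct read, for example with boto3 (the table name and key below are placeholders):
# Read the item directly and print the attribute the console wasn't showing
import boto3

table = boto3.resource("dynamodb").Table("MyTable")
item = table.get_item(Key={"email": "my#e.mail"}).get("Item", {})
print(item.get("thisThingsMissingInConsole"))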

AWS Step Functions: Filter an array using JsonPath

I need to filter an array in my AWS Step Functions state. This seems like something I should easily be able to achieve with JsonPath but I am struggling for some reason.
The state I want to process looks like this:
{
  "items": [
    {
      "id": "A"
    },
    {
      "id": "B"
    },
    {
      "id": "C"
    }
  ]
}
I want to filter this array by removing entries for which id is not in a specified whitelist.
To do this, I define a Pass state in the following way:
"ApplyFilter": {
"Type": "Pass",
"ResultPath": "$.items",
"InputPath": "$.items.[?(#.id in ['A'])]",
"Next": "MapDeployments"
}
This makes use of the JsonPath in operator.
Unfortunately when I execute the state machine I receive an error:
{
  "error": "States.Runtime",
  "cause": "An error occurred while executing the state 'ApplyFilter' (entered at the event id #8). Invalid path '$.items.[?(#.id in ['A'])]' : com.jayway.jsonpath.InvalidPathException: com.jayway.jsonpath.InvalidPathException: Space not allowed in path"
}
However, I don't understand what is incorrect with the syntax. When I test here everything works correctly.
What is wrong with what I have done? Is there another way of achieving this sort of filter using JsonPath?
According to the official AWS docs for Step Functions, the following are not supported in paths: # .. , : ? *
https://docs.aws.amazon.com/step-functions/latest/dg/amazon-states-language-paths.html
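Since the filter expression can't go in the state's path, one workaround is to do the filtering in a Task state instead; below is a sketch of a small Lambda handler, with the "items" shape and the whitelist taken from the question:
# Keep only the items whose id is in the whitelist
ALLOWED_IDS = {"A"}

def handler(event, context):
    filtered = [item for item in event.get("items", []) if item.get("id") in ALLOWED_IDS]
    return {"items": filtered}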

Load JSON File into Robot Framework

I am trying to load a JSON file and use the values to perform some actions based on my tests. I tried to load the JSON values, which I think I got right, but when trying to log the output I got this error message:
Resolving variable '${qa["REQUEST_ID"]}' failed: TypeError: list indices must be integers or slices, not str
Not exactly sure what this means since I am new to Robot Framework. This is what I did to load and log the values:
${file}    Get File    ${CURDIR}/RequestIDs.json
${qa}    Evaluate    json.loads('''${file}''')    json
Log To Console    ${qa["REQUEST_ID"]}
The JSON file looks something like:
[
  {
    "REQUEST_ID": 10513
  },
  {
    "REQUEST_ID": 48156
  },
  {
    "REQUEST_ID": 455131
  }
]
So basically I want to get the "REQUEST_ID" value and type it into a text field.
Look at the structure of your JSON - it's a list of dictionaries, so you first have to specify which list member you want, and then its REQUEST_ID field:
Log To Console    ${qa[0]["REQUEST_ID"]}

# print the value from all present dictionaries in the list:
FOR    ${member}    IN    @{qa}
    Log To Console    ${member["REQUEST_ID"]}
END
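To type the value into a text field, as the question mentions, something along these lines should work if SeleniumLibrary is imported (the locator is a placeholder):
${request_id}=    Convert To String    ${qa[0]["REQUEST_ID"]}
Input Text    id=request-id-field    ${request_id}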

apache airflow variables on startup

I'm learning Airflow and am planning to set some variables to use across different tasks. These are in my dags folder, saved as configs.json, like so:
{
  "vars": {
    "task1_args": {
      "something": "This is task 1"
    },
    "task2_args": {
      "something": "this is task 2"
    }
  }
}
I get that we can go to Admin --> Variables and upload the file. But I have 2 questions:
What if I want to adjust some of the variables while Airflow is running? I can adjust my code easily and it updates in real time, but it doesn't seem like this works for variables.
Is there a way to just auto-import this specific file on startup? I don't want to have to add it every time I'm testing my project.
I don't see this mentioned in the docs but it seems like a pretty trivial thing to want.
What you are looking for is "With code, how do you update an airflow variable?"
Here's an untested snippet that should help
from airflow.models import Variable
Variable.set(key="my_key", value="my_value")
So basically you can write a bootstrap Python script to do this setup for you, as sketched below.
In our team, we use such scripts to set up all Connections and Pools too.
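A minimal bootstrap sketch for the configs.json from the question; the file path and the variable key are assumptions:
import json

from airflow.models import Variable

# Load the configs.json from the dags folder (path is a placeholder)
with open("/path/to/dags/configs.json") as f:
    configs = json.load(f)

# serialize_json=True stores the dict as JSON, so it can be read back with
# Variable.get("vars", deserialize_json=True)
Variable.set(key="vars", value=configs["vars"], serialize_json=True)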
In case you are wondering, here's the set(..) method from the source:
@classmethod
@provide_session
def set(
    cls,
    key: str,
    value: Any,
    serialize_json: bool = False,
    session: Session = None
):
    """
    Sets a value for an Airflow Variable with a given Key
    :param key: Variable Key
    :param value: Value to set for the Variable
    :param serialize_json: Serialize the value to a JSON string
    :param session: SQL Alchemy Sessions
    """
    if serialize_json:
        stored_value = json.dumps(value, indent=2)
    else:
        stored_value = str(value)
    Variable.delete(key, session=session)
    session.add(Variable(key=key, val=stored_value))
    session.flush()
