How to use a serverless environment variable in a Step Functions parameter - aws-serverless

I have a query with hardcoded dates in the Parameters section. Instead, I want to pass them in as environment variables. Any suggestions on how to parameterize the QueryString parameter?
service: service-name
frameworkVersion: '2'

provider:
  name: aws
  runtime: go1.x
  lambdaHashingVersion: 20201221
  stage: ${opt:stage, self:custom.defaultStage}
  region: us-east-1
  tags: ${self:custom.tagsObject}
  logRetentionInDays: 1
  timeout: 10
  deploymentBucket: lambda-repository
  memorySize: 128
  tracing:
    lambda: true

plugins:
  - serverless-step-functions

configValidationMode: error

stepFunctions:
  stateMachines:
    callAthena:
      name: datasorting-dev
      type: STANDARD
      role: ${self:custom.datasorting.${self:provider.stage}.iam}
      definition:
        Comment: "Data Refresh"
        StartAt: Refresh Data
        States:
          Refresh Data:
            Type: Task
            Resource: arn:aws:states:::athena:startQueryExecution.sync
            Parameters:
              QueryString: >-
                ALTER TABLE table.raw_data ADD IF NOT EXISTS
                PARTITION (YEAR=2021, MONTH=02, DAY=15, hour=00)
              WorkGroup: primary
              ResultConfiguration:
                OutputLocation: s3://output/location
            End: true

You can replace any value in your serverless.yml by enclosing it in ${} brackets; see the Serverless Framework Guide to Variables:
https://www.serverless.com/framework/docs/providers/aws/guide/variables/
For example, you can create a custom: section that reads environment variables and falls back to default values when they are not set:
service: service-name
frameworkVersion: '2'

custom:
  year: ${env:YEAR, 'default-year'}
  month: ${env:MONTH, 'default-month'}
  day: ${env:DAY, 'default-day'}
  hour: ${env:HOUR, 'default-hour'}

stepFunctions:
  stateMachines:
    callAthena:
      ...
      Parameters:
        QueryString: >-
          ALTER TABLE table.raw_data ADD IF NOT EXISTS
          PARTITION (YEAR=${self:custom.year}, MONTH=${self:custom.month}, DAY=${self:custom.day}, hour=${self:custom.hour})
      ...
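Note that ${env:...} values are resolved while the service is being packaged, so the dates get baked into the deployed state machine definition rather than read at execution time. Supplying them at deploy time could then look something like this (illustrative values, assuming the Serverless CLI is invoked as sls):

$ YEAR=2021 MONTH=02 DAY=15 HOUR=00 sls deploy --stage dev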

How to retrieve dictionary value when creating key from set_fact

I am trying to retrieve a value from a dictionary by 'dotting' through the keys with a variable (using set_fact).
How do I retrieve the value of applications.office.nuspec.id if I built the key string through set_fact?
Here is my dictionary:

vars:
  applications:
    office:
      nuspec:
        id: data_wanted
Here is the code put together, with "current_chocolatey_parameter_value" storing the location of the dictionary value I want:

- name: Set variable to id
  set_fact: selected_current_chocolatey_parameter=id

- name: Create string to represent variable to select value from dictionary
  set_fact: current_chocolatey_parameter_value="applications.office.nuspec.{{ selected_current_chocolatey_parameter }}"

- name: The combined new string is
  debug: msg="{{ current_chocolatey_parameter_value }}"

TASK [The combined new string is] *********************************************
ok: [localhost] => {
    "msg": "applications.office.nuspec.id"
}
I have tried lookup('vars', current_chocolatey_parameter_value) with no luck.
How do I get the value of applications.office.nuspec.id from my defined dictionary when I have the string stored in a variable that I want to reference it with?
---
- name: Test lookup play
  hosts: localhost
  connection: local
  gather_facts: false
  vars:
    applications:
      office:
        nuspec:
          id: data_wanted
  tasks:
    - name: Set variable to id
      set_fact: selected_current_chocolatey_parameter=id

    - name: Create string to represent variable to select value from dictionary
      set_fact: current_chocolatey_parameter_value="applications.office.nuspec.{{ selected_current_chocolatey_parameter }}"

    - name: The combined new string is
      debug: msg="{{ current_chocolatey_parameter_value }}"

    - name: Looked up value 1
      debug: msg="{{ applications.office.nuspec[selected_current_chocolatey_parameter] }}"

    - name: Looked up value 2
      debug: msg="{{ applications.office.nuspec.get(selected_current_chocolatey_parameter) }}"
Produces:

$ ansible-playbook -i localhost, stackoverflow1.yml

PLAY [Test lookup play] *******************************************************

TASK [Set variable to id] *****************************************************
ok: [localhost]

TASK [Create string to represent variable to select value from dictionary] ****
ok: [localhost]

TASK [The combined new string is] *********************************************
ok: [localhost] => {
    "msg": "applications.office.nuspec.id"
}

TASK [Looked up value 1] ******************************************************
ok: [localhost] => {
    "msg": "data_wanted"
}

TASK [Looked up value 2] ******************************************************
ok: [localhost] => {
    "msg": "data_wanted"
}
Remember that you can call core Python methods (like dict.get above) inside "{{ ... }}" expressions.
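If you need to resolve the full dotted string itself (as in the lookup('vars', ...) attempt), one option is a JMESPath query over the variable namespace; a sketch, assuming the jmespath Python library is installed for the json_query filter (shipped in community.general on newer Ansible):

- name: Looked up value from the dotted string
  debug: msg="{{ vars | json_query(current_chocolatey_parameter_value) }}"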

HOT template for Cinder volume with or without volume_type

I am trying to write a HOT template for an OpenStack volume, and need to have the volume_type as a parameter. I also need to support the case where the parameter is not given, defaulting to the Cinder default volume type.
My first attempt was to pass null to volume_type, hoping it would give the default volume type. However, no matter what I pass (null, ~, default, ""), there seems to be no way to get the default volume type.
type: OS::Cinder::Volume
properties:
  name: test
  size: 1
  volume_type: {if: ["voltype_given", {get_param: [typename]}, null]}
Is there any way to get the default volume type when the volume_type property is defined?
Alternatively, is there any way to put the volume_type property itself behind a conditional? I tried several ways, but no luck. Something like:

type: OS::Cinder::Volume
properties:
  if: ["voltype_given", [volume_type: {get_param: [typename]}], ""]
  name: test
  size: 1

ERROR: TypeError: : resources.kk-test-vol: : 'If' object is not iterable
Could you do something like this?
---
parameters:
  typename:
    type: string

conditions:
  use_default_type: {equals: [{get_param: typename}, '']}

resources:
  MyVolumeWithDefault:
    condition: use_default_type
    type: OS::Cinder::Volume
    properties:
      name: test
      size: 1

  MyVolumeWithExplicit:
    condition: {not: use_default_type}
    type: OS::Cinder::Volume
    properties:
      name: test
      size: 1
      volume_type: {get_param: typename}

  # e.g. if you need to refer to the volume from another resource
  MyVolumeAttachment:
    type: OS::Cinder::VolumeAttachment
    properties:
      instance_uuid: some-instance-uuid
      volume_id:
        if:
          - use_default_type
          - get_resource: MyVolumeWithDefault
          - get_resource: MyVolumeWithExplicit
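One caveat with the sketch above: Heat treats a parameter with no default as required, so for the "parameter not given" case to work without passing typename at all, give it an empty default:

parameters:
  typename:
    type: string
    default: ''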

Not able to execute lifecycle operation using script plugin

I'm trying to learn how to use the script plugin. I'm following the script plugin docs here but am not able to make it work.
I've tried to use the plugin in two ways. The first maps the cloudify.interfaces.lifecycle.start operation directly to a script:
tosca_definitions_version: cloudify_dsl_1_3

imports:
  - 'http://www.getcloudify.org/spec/cloudify/4.5.5/types.yaml'

node_templates:
  Import_Project:
    type: cloudify.nodes.WebServer
    capabilities:
      scalable:
        properties:
          default_instances: 1
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: scripts/create_project.sh
          inputs: {}
The second maps the operation to the script plugin's task explicitly:
tosca_definitions_version: cloudify_dsl_1_3

imports:
  - 'http://www.getcloudify.org/spec/cloudify/4.5.5/types.yaml'

node_templates:
  Import_Project:
    type: cloudify.nodes.WebServer
    capabilities:
      scalable:
        properties:
          default_instances: 1
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: script.script_runner.tasks.run
          inputs:
            script_path: scripts/create_project.sh
I've created a directory named scripts and placed the create_project.sh script below in it:

#!/bin/bash -e
ctx logger info "Hello to this world"
hostname
I'm getting errors while validating the blueprint.
Error when the operation is mapped directly to a script:
[2019-04-13 13:29:40.594] [DEBUG] DslParserExecClient - got output from dsl parser Could not extract plugin from operation mapping 'scripts/create_project.sh', which is declared for operation 'start'. In interface 'cloudify.interfaces.lifecycle' in node 'Import_Project' of type 'cloudify.nodes.WebServer'
in: /opt/cloudify-composer/backend/dev/workspace/2/tmp-27O0e1t813N6as
in line: 3, column: 2
path: node_templates.Import_Project
value: {'interfaces': {'cloudify.interfaces.lifecycle': {'start': {'implementation': 'scripts/create_project.sh', 'inputs': {}}}}, 'type': 'cloudify.nodes.WebServer', 'capabilities': {'scalable': {'properties': {'default_instances': 1}}}}
Error when using the explicit plugin-task mapping:
[2019-04-13 13:25:21.015] [DEBUG] DslParserExecClient - got output from dsl parser node 'Import_Project' has no relationship which makes it contained within a host and it has a plugin 'script' with 'host_agent' as an executor. These types of plugins must be installed on a host
in: /opt/cloudify-composer/backend/dev/workspace/2/tmp-279QCz2CV3Y81L
in line: 2, column: 0
path: node_templates
value: {'Import_Project': {'interfaces': {'cloudify.interfaces.lifecycle': {'start': {'implementation': 'script.script_runner.tasks.run', 'inputs': {'script_path': 'scripts/create_project.sh'}}}}, 'type': 'cloudify.nodes.WebServer', 'capabilities': {'scalable': {'properties': {'default_instances': 1}}}}}
What is missing to make this work?
I also found that the Cloudify script plugin examples from the documentation do not work: https://docs.cloudify.co/4.6/working_with/official_plugins/configuration/script/
However, I found I can make the examples work by adding an executor line alongside the implementation line to override the host_agent executor, as follows:
tosca_definitions_version: cloudify_dsl_1_3

imports:
  - 'http://www.getcloudify.org/spec/cloudify/4.5.5/types.yaml'

node_templates:
  Import_Project:
    type: cloudify.nodes.WebServer
    capabilities:
      scalable:
        properties:
          default_instances: 1
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: scripts/create_project.sh
          executor: central_deployment_agent
          inputs: {}
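The same executor override should also work for the explicit plugin-task mapping form; an untested sketch of just the interfaces fragment:

interfaces:
  cloudify.interfaces.lifecycle:
    start:
      implementation: script.script_runner.tasks.run
      executor: central_deployment_agent
      inputs:
        script_path: scripts/create_project.sh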

How to find a key from its value in an Ansible (YAML) dictionary?

I have this dictionary:
MyClouds:
  Devwatt:
    ExternalNetwork: PublicRSC
    Flavors:
      - Flavor_1cpu_1gb: Devwatt_1cpu_1gb
      - Flavor_1cpu_2gb: Devwatt_1cpu_2gb
      - Flavor_1cpu_4gb: Devwatt_1cpu_4gb
  Fuga:
    ExternalNetwork: Internet
    Flavors:
      - Flavor_1cpu_1gb: Fuga_1cpu_1gb
      - Flavor_1cpu_2gb: Fuga_1cpu_2gb
      - Flavor_1cpu_4gb: Fuga_1cpu_4gb
      - Flavor_1cpu_8gb: Fuga_1cpu_8gb
I have to migrate from one OpenStack cloud to another, and one of my problems is finding correspondences between flavors.
I want to find which flavor (key) has the value "Devwatt_1cpu_2gb" in "Devwatt", and then get the value of the same key in "Fuga".
I tried a lot of solutions (with_dict, when, Jinja filters, json_query) but I can't find a way to do that.
Can you help me, please?
Inspired by Eric's answer and this useful resource, I finally used this solution:
I changed my data structure a little and put it in a file matrice.yml:
MyClouds:
  Devwatt:
    ExternalNetwork: PublicRSC
    Flavors:
      - name: Flavor_1cpu_1gb
        FlavorName: Devwatt_1cpu_1gb
      - name: Flavor_2cpu_1gb
        FlavorName: Devwatt_2cpu_1gb
      - name: Flavor_1cpu_2gb
        FlavorName: Devwatt_1cpu_2gb
  Fuga:
    ExternalNetwork: Internet
    Flavors:
      - name: Flavor_1cpu_1gb
        FlavorName: Fuga_1cpu_1gb
      - name: Flavor_2cpu_1gb
        FlavorName: Fuga_2cpu_1gb
      - name: Flavor_1cpu_2gb
        FlavorName: Fuga_1cpu_2gb
Then I used these filters in my playbook:

---
- hosts: localhost
  connection: local
  gather_facts: false
  vars:
    SourceFlavorName: "Devwatt_2cpu_1gb"
  tasks:
    - name: get flavors matrice
      include_vars:
        file: matrice.yml

    - name: Get generic name from flavor name of source cloud
      debug:
        msg: "{{ MyClouds.Devwatt.Flavors | selectattr('FlavorName', 'search', '^' + SourceFlavorName + '$') | map(attribute='name') | list }}"
      register: result

    - name: Get flavor name for target cloud from generic name
      debug:
        msg: "{{ MyClouds.Fuga.Flavors | selectattr('name', 'search', '^' + result.msg[0] + '$') | map(attribute='FlavorName') | list }}"
With this solution I can have any number of clouds and easily find correspondences between flavors from a source cloud and a target cloud.
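For the record, the original list of single-key dicts can also be searched in place, without restructuring; a sketch, assuming Ansible 2.6+ for the dict2items filter:

- name: Find which generic flavor (key) has the value "Devwatt_1cpu_2gb"
  debug:
    msg: "{{ MyClouds.Devwatt.Flavors | map('dict2items') | flatten | selectattr('value', 'equalto', 'Devwatt_1cpu_2gb') | map(attribute='key') | first }}"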
Why not use a simple mapping with a dict whose keys are "Devwatt" flavors and whose values are "Fuga" flavors, like this:
---
- hosts: localhost
  vars:
    FlavorsMapping:
      Devwatt_1cpu_1gb: Fuga_1cpu_1gb
      Devwatt_1cpu_2gb: Fuga_1cpu_2gb
      Devwatt_1cpu_4gb: Fuga_1cpu_4gb
  tasks:
    - debug:
        var: FlavorsMapping['Devwatt_1cpu_2gb']

How to use variable values in a Salt formula?

Consider my SLS file:
state1:
  cmd.run:
    - order: 1
    - name: |
        USER_NAME='username'
        USERPWD='password'
        DB_NAME='test'
        USER_TOBE_CREATED='new_user'
        PASSWORD='newpass'
  mysql_user.present:
    - order: 2
    - host: localhost
    - username: USER_TOBE_CREATED
    - password: PASSWORD
    - connection_user: USER_NAME
    - connection_pass: USERPWD
    - connection_charset: utf8
    - saltenv:
      - LC_ALL: "en_US.utf8"
  mysql_grants.present:
    - order: 3
    - grant: all privileges
    - database: DB_NAME.*
    - user: USER_TOBE_CREATED
In the mysql_user.present and mysql_grants.present states I am using the variables USER_TOBE_CREATED, USER_NAME, USERPWD, etc., whose values are assigned in the cmd.run state. How can I make these two states use the actual values of those variables? Here it's taking the variable name itself as the value.
You may want to declare the variables in the state file itself, i.e.:
{% set user_name = "name" %}

and:

state1:
  cmd.run:
    - order: 1
    - name: |
        USER_NAME='{{ user_name }}'
You can re-use the variable as many times as you want inside the state file.
Let me know if this helped.
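Applied to the state above, a minimal sketch keeping the original parameter names (values are illustrative; Jinja is rendered before the SLS is parsed, so every state sees the final strings):

{% set user_name = 'username' %}
{% set user_pwd = 'password' %}
{% set db_name = 'test' %}
{% set user_tobe_created = 'new_user' %}
{% set password = 'newpass' %}

state1:
  mysql_user.present:
    - host: localhost
    - username: {{ user_tobe_created }}
    - password: {{ password }}
    - connection_user: {{ user_name }}
    - connection_pass: {{ user_pwd }}
    - connection_charset: utf8
  mysql_grants.present:
    - grant: all privileges
    - database: {{ db_name }}.*
    - user: {{ user_tobe_created }}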
