How to append data to an existing YAML file - PyYAML

Here is my YAML:
---
version: 1.0
employee:
  name:
    employee1
headers:
I have a dict, dict_data, as below:
dict_data = {'Key1': 'value1', 'Key2': 'value2'}
Now I want to append this dict to the headers attribute in the YAML.
The YAML should then look like this:
---
version: 1.0
employee:
  name:
    employee1
headers:
  Key1: value1
  Key2: value2
I tried doing this with PyYAML using update, but it dumped only the dict to the YAML file and wiped out all the other contents.

You can read, update, and rewrite the file:
import yaml

with open('filename.yaml', 'r') as yamlfile:
    current_yaml = yaml.safe_load(yamlfile)

# 'headers' is empty in the source file, so it loads as None
if current_yaml.get('headers') is None:
    current_yaml['headers'] = {}
current_yaml['headers'].update(dict_data)

with open('filename.yaml', 'w') as yamlfile:
    yaml.safe_dump(current_yaml, yamlfile)
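For the sample file above, the rewritten filename.yaml would come out roughly as below (a sketch assuming PyYAML 5.x defaults: safe_dump sorts keys alphabetically and omits the --- document-start marker unless explicit_start=True is passed):

employee:
  name: employee1
headers:
  Key1: value1
  Key2: value2
version: 1.0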

Related

Command "generate-types" is not defined

I'm trying to generate types from a schema.yaml file that I created:
schema.yaml
types:
  product:
    properties:
      name: { nullable: false }
      description: { nullable: false }
      image: { range: "Text" }
  Offer:
    properties:
      url: { nullable: false }
      price: { nullable: false, range: "Number" }
      priceCurrency: { nullable: false }
Then, when I execute the following command:
vendor/bin/schema generate-types src/ config/schema.yaml
I receive the following error:
Command "generate-types" is not defined
I tried to find out the reason but couldn't. Where can I define the generate-types command? My Symfony version is 5.3. Thanks a lot.
See this changelog:
The generate-types command has been renamed generate
You need to change generate-types to generate.
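With that rename, and assuming the arguments are otherwise unchanged, the command from the question becomes:

vendor/bin/schema generate src/ config/schema.yaml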

How to use a Serverless environment variable in a Step Functions parameter

I have a query with hardcoded dates in the Parameters section. Instead, I want to pass them in as environment variables. Any suggestions on how to parameterize the QueryString parameter?
service: service-name
frameworkVersion: '2'
provider:
  name: aws
  runtime: go1.x
  lambdaHashingVersion: 20201221
  stage: ${opt:stage, self:custom.defaultStage}
  region: us-east-1
  tags: ${self:custom.tagsObject}
  logRetentionInDays: 1
  timeout: 10
  deploymentBucket: lambda-repository
  memorySize: 128
  tracing:
    lambda: true
plugins:
  - serverless-step-functions
configValidationMode: error
stepFunctions:
  stateMachines:
    callAthena:
      name: datasorting-dev
      type: STANDARD
      role: ${self:custom.datasorting.${self:provider.stage}.iam}
      definition:
        Comment: "Data Refresh"
        StartAt: Refresh Data
        States:
          Refresh Data:
            Type: Task
            Resource: arn:aws:states:::athena:startQueryExecution.sync
            Parameters:
              QueryString: >-
                ALTER TABLE table.raw_data ADD IF NOT EXISTS
                PARTITION (YEAR=2021, MONTH=02, DAY=15, hour=00)
              WorkGroup: primary
              ResultConfiguration:
                OutputLocation: s3://output/location
            End: true
You can replace any value in your serverless.yml that is enclosed in ${} brackets.
See the Serverless Framework guide to variables:
https://www.serverless.com/framework/docs/providers/aws/guide/variables/
For example, you can create a custom: section that looks up environment variables and falls back to default values when they are not present:
service: service-name
frameworkVersion: '2'

custom:
  year: ${env:YEAR, 'default-year'}
  month: ${env:MONTH, 'default-month'}
  day: ${env:DAY, 'default-day'}
  hour: ${env:HOUR, 'default-hour'}

stepFunctions:
  stateMachines:
    callAthena:
      ...
      Parameters:
        QueryString: >-
          ALTER TABLE table.raw_data ADD IF NOT EXISTS
          PARTITION (YEAR=${self:custom.year}, MONTH=${self:custom.month}, DAY=${self:custom.day}, hour=${self:custom.hour})
      ...
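For example, with the variables exported in the shell at deploy time (the values here are purely illustrative), the framework resolves the ${env:...} references while packaging:

YEAR=2021 MONTH=02 DAY=15 HOUR=00 serverless deploy --stage dev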

Not able to execute lifecycle operation using script plugin

I'm trying to learn how to use the script plugin. I'm following the script plugin docs here but am not able to make it work.
I've tried to use the plugin in two ways. In the first, the cloudify.interfaces.lifecycle.start operation is mapped directly to a script:
tosca_definitions_version: cloudify_dsl_1_3
imports:
  - 'http://www.getcloudify.org/spec/cloudify/4.5.5/types.yaml'
node_templates:
  Import_Project:
    type: cloudify.nodes.WebServer
    capabilities:
      scalable:
        properties:
          default_instances: 1
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: scripts/create_project.sh
          inputs: {}
In the second, the operation is mapped to the script plugin task explicitly:
tosca_definitions_version: cloudify_dsl_1_3
imports:
  - 'http://www.getcloudify.org/spec/cloudify/4.5.5/types.yaml'
node_templates:
  Import_Project:
    type: cloudify.nodes.WebServer
    capabilities:
      scalable:
        properties:
          default_instances: 1
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: script.script_runner.tasks.run
          inputs:
            script_path: scripts/create_project.sh
I've created a directory named scripts and placed the below create_project.sh script in this directory:
#! /bin/bash -e
ctx logger info "Hello to this world"
hostname
I'm getting errors while validating the blueprint.
Error when the operation is mapped directly to a script:
[2019-04-13 13:29:40.594] [DEBUG] DslParserExecClient - got output from dsl parser Could not extract plugin from operation mapping 'scripts/create_project.sh', which is declared for operation 'start'. In interface 'cloudify.interfaces.lifecycle' in node 'Import_Project' of type 'cloudify.nodes.WebServer'
in: /opt/cloudify-composer/backend/dev/workspace/2/tmp-27O0e1t813N6as
in line: 3, column: 2
path: node_templates.Import_Project
value: {'interfaces': {'cloudify.interfaces.lifecycle': {'start': {'implementation': 'scripts/create_project.sh', 'inputs': {}}}}, 'type': 'cloudify.nodes.WebServer', 'capabilities': {'scalable': {'properties': {'default_instances': 1}}}}
Error when the operation is mapped to the script plugin task:
[2019-04-13 13:25:21.015] [DEBUG] DslParserExecClient - got output from dsl parser node 'Import_Project' has no relationship which makes it contained within a host and it has a plugin 'script' with 'host_agent' as an executor. These types of plugins must be installed on a host
in: /opt/cloudify-composer/backend/dev/workspace/2/tmp-279QCz2CV3Y81L
in line: 2, column: 0
path: node_templates
value: {'Import_Project': {'interfaces': {'cloudify.interfaces.lifecycle': {'start': {'implementation': 'script.script_runner.tasks.run', 'inputs': {'script_path': 'scripts/create_project.sh'}}}}, 'type': 'cloudify.nodes.WebServer', 'capabilities': {'scalable': {'properties': {'default_instances': 1}}}}}
What is missing to make this work?
I also found the Cloudify Script Plugin examples from their documentation do not work: https://docs.cloudify.co/4.6/working_with/official_plugins/configuration/script/
However, I found I can make the examples work by adding an executor line alongside the implementation line to override the host_agent executor, as follows:
tosca_definitions_version: cloudify_dsl_1_3
imports:
  - 'http://www.getcloudify.org/spec/cloudify/4.5.5/types.yaml'
node_templates:
  Import_Project:
    type: cloudify.nodes.WebServer
    capabilities:
      scalable:
        properties:
          default_instances: 1
    interfaces:
      cloudify.interfaces.lifecycle:
        start:
          implementation: scripts/create_project.sh
          executor: central_deployment_agent
          inputs: {}
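With the executor override in place, the blueprint can be checked before uploading, assuming the Cloudify CLI is installed (blueprint.yaml is a placeholder for your blueprint file):

cfy blueprints validate blueprint.yaml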

412 "no matching index found" while executing a query in Cloud Datastore

I am using the gcloud-python library for querying data from Cloud Datastore. Consider a snippet like this:
from google.appengine.ext import ndb
from datetime import datetime, date, time

class Order(ndb.Model):
    order_name = ndb.StringProperty(required=True)
    date_created = ndb.DateTimeProperty(default=datetime.now())

# code for querying the cloud datastore
from gcloud.datastore.query import Query

date_start = datetime.combine(date(year=2015, month=8, day=1), time())
date_end = datetime.combine(date(year=2015, month=8, day=3), time())

query = Query(kind='Order')
query.add_filter('order_name', '=', 'grand-line-order')
query.add_filter('date_created', '<', date_end)
query.add_filter('date_created', '>', date_start)
iterator = query.fetch(limit=10)
records, more, cursor = iterator.next_page()
print records
For the above snippet I am getting this traceback:
File "/Users/sathyanarrayanan/Desktop/app/services/cdr_helper.py", line 528, in fetch_cdr
records, more, cursor = iterator.next_page()
File "/Users/sathyanarrayanan/Desktop/app/gcloud/datastore/query.py", line 388, in next_page
transaction_id=transaction and transaction.id,
File "/Users/sathyanarrayanan/Desktop/app/gcloud/datastore/connection.py", line 257, in run_query
datastore_pb.RunQueryResponse)
File "/Users/sathyanarrayanan/Desktop/app/gcloud/datastore/connection.py", line 108, in _rpc
data=request_pb.SerializeToString())
File "/Users/sathyanarrayanan/Desktop/app/gcloud/datastore/connection.py", line 85, in _request
raise make_exception(headers, content, use_json=False)
PreconditionFailed: 412 no matching index found.
My index.yaml file is like this:
indexes:

- kind: Order
  ancestor: yes
  properties:
  - name: date_created

- kind: Order
  ancestor: yes
  properties:
  - name: date_created
    direction: desc

- kind: Order
  ancestor: yes
  properties:
  - name: order_name
    direction: asc
  - name: date_created
    direction: desc

- kind: Order
  ancestor: yes
  properties:
  - name: order_name
    direction: asc
  - name: date_created
    direction: asc
Am I doing something wrong? Please help me out.
All of your indexes use ancestor: yes, so an ancestor key should be added to your query. Without an ancestor, your query requires another index with ancestor: no:
- kind: Order
  ancestor: no
  properties:
  - name: order_name
    direction: asc
  - name: date_created
    direction: desc
Note: each distinct query shape needs its own index.
The index configuration docs for the Java runtime indicate that the index configuration should go in an XML file called datastore-indexes.xml; for Python, index.yaml is the right place.
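Whichever file you use, the composite indexes have to be deployed before the query will succeed; with the current gcloud CLI that is (a sketch, assuming index.yaml sits in the working directory):

gcloud datastore indexes create index.yaml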

Import DateTime using the Nelmio/Alice bundle

My question is about this bundle, https://github.com/nelmio/alice, in combination with Symfony2.
I have some fixtures I want to load into my new website, and this bundle is great for that. I created some YML files; consider the following YML as my fixture data:
DateTime (local):
  news-date-1:
    __construct: ['2014-07-01']

Stef\BVBundle\Entity\Blog:
  StefBVBundle-Blog-1:
    title: 'A day with blah'
    blog: 'e5645646'
    image: 'beach.jpg'
    author: 'dsyph3r'
    tags: 'symfony2, php, paradise, symblog'
    created: #news-date-1
    updated: #news-date-1
  StefBVBundle-Blog-2:
    id: 1
    title: 'meeeh'
    author: dsyph3r
    blog: '5rw5425'
    image: beach.jpg
    tags: 'symfony2, php, paradise, symblog'
    created: '2014-07-01T00:00:00+0200'
    updated: '2014-07-01T00:00:00+0200'
The one labelled 'StefBVBundle-Blog-1' works like a charm: it knows 'created' and 'updated' are \DateTime values.
But 'StefBVBundle-Blog-2' causes an error, because the Nelmio/Alice bundle considers them strings instead of DateTimes. Is it possible to do the DateTime part inline?
PHP expressions inside <( )> are simply passed to Doctrine, so this will do the trick:
Stef\BVBundle\Entity\Blog:
  StefBVBundle-Blog-2:
    created: <(new \DateTime('2014-02-02'))>
According to the docs of the Faker library, you have to specify a DateTime instance, or use dateTimeBetween with no time lapse if you want an exact date.
Your code, with the correction:
DateTime (local):
  news-date-1:
    __construct: ['2014-07-01']

Stef\BVBundle\Entity\Blog:
  StefBVBundle-Blog-1:
    title: 'A day with blah'
    blog: 'e5645646'
    image: 'beach.jpg'
    author: 'dsyph3r'
    tags: 'symfony2, php, paradise, symblog'
    created: #news-date-1
    updated: #news-date-1
  StefBVBundle-Blog-2:
    id: 1
    title: 'meeeh'
    author: dsyph3r
    blog: '5rw5425'
    image: beach.jpg
    tags: 'symfony2, php, paradise, symblog'
    created: <dateTimeBetween('0 days', '2014-07-01T00:00:00+0200')>
    updated: <dateTimeBetween('0 days', '2014-07-01T00:00:00+0200')>
I didn't try it, but it should work.
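Combining the two answers, the inline PHP expression from the first answer could also be dropped straight into the Blog-2 fixture instead of dateTimeBetween (a sketch reusing only the fields shown above):

Stef\BVBundle\Entity\Blog:
  StefBVBundle-Blog-2:
    id: 1
    title: 'meeeh'
    created: <(new \DateTime('2014-07-01T00:00:00+0200'))>
    updated: <(new \DateTime('2014-07-01T00:00:00+0200'))>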
