In Terraform, how can you easily create 10 alarms per DynamoDB table?

A single Terraform alarm resource looks something like this:
resource "aws_cloudwatch_metric_alarm" "ecs_cpu_reservation" {
alarm_name = "ecs-cpu-reservation-${var.environment}"
alarm_description = "my description"
namespace = "AWS/ECS"
metric_name = "CPUReservation"
dimensions {
ClusterName = "${var.environment}"
}
statistic = "Average"
period = "300"
evaluation_periods = "${var.acceptable_cpu_reservation_eval_period}"
comparison_operator = "GreaterThanOrEqualToThreshold"
threshold = "${var.acceptable_cpu_reservation}"
alarm_actions = ["${data.terraform_remote_state.vpc.my_topic_arn}"]
actions_enabled = "${var.alerting_enabled}"
}
I have 10 alarms per table, and 50 tables.
Therefore, the .tf file will contain 500 of those resource blocks. That's a huge file!
The vast majority of the alarms are identical; the only difference is which table each alarm is for.
Is there a way to loop over a list of table names and create the alarms?
From what I read, using the "count" variable (or iterating over a list, as in the sketch below) will lead to maintenance nightmares.
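A minimal sketch of the count-based approach I mean (table names and the threshold here are placeholders):

variable "table_names" {
  type    = "list"
  default = ["table_one", "table_two"] # placeholder names
}

resource "aws_cloudwatch_metric_alarm" "consumed_read_units" {
  count               = "${length(var.table_names)}"
  alarm_name          = "dynamodb_${element(var.table_names, count.index)}_consumed_read_units"
  namespace           = "AWS/DynamoDB"
  metric_name         = "ConsumedReadCapacityUnits"
  statistic           = "Average"
  period              = "120"
  evaluation_periods  = "2"
  comparison_operator = "GreaterThanOrEqualToThreshold"
  threshold           = "240" # placeholder

  dimensions {
    TableName = "${element(var.table_names, count.index)}"
  }
}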

One other option that avoids looping is to wrap the alarms in a module, so you only need to source that module once per table to get everything inside it.
So if you had a module that looked like this:
variable "dynamodb_table_name" {}
variable "consumed_read_units_threshold" {}
variable "consumed_write_units_threshold" {}
...
resource "aws_cloudwatch_metric_alarm" "consumed_read_units" {
alarm_name = "dynamodb_${var.dynamodb_table_name}_consumed_read_units"
comparison_operator = "GreaterThanOrEqualToThreshold"
evaluation_periods = "2"
metric_name = "ConsumedReadCapacityUnits"
namespace = "AWS/DynamoDB"
period = "120"
statistic = "Average"
dimensions {
TableName = "${var.dynamodb_table_name}"
}
threshold = "${var.consumed_read_units_threshold}"
alarm_description = "This metric monitors DynamoDB ConsumedReadCapacityUnits for ${var.dynamodb_table_name}"
insufficient_data_actions = []
}
resource "aws_cloudwatch_metric_alarm" "consumed_write_units" {
alarm_name = "dynamodb_${var.dynamodb_table_name}_consumed_write_units"
comparison_operator = "GreaterThanOrEqualToThreshold"
evaluation_periods = "2"
metric_name = "ConsumedWriteCapacityUnits"
namespace = "AWS/DynamoDB"
period = "120"
statistic = "Average"
dimensions {
TableName = "${var.dynamodb_table_name}"
}
threshold = "${var.consumed_write_units_threshold}"
alarm_description = "This metric monitors DynamoDB ConsumedWriteCapacityUnits for ${var.dynamodb_table_name}"
insufficient_data_actions = []
}
...
You could define all of your DynamoDB table alarms in a single place and then source that module whenever you create a DynamoDB table, and every table would get metric alarms for everything you care about. It's then easy to add/remove/modify these alarms in one place (the module) and have the changes applied to all of your tables on the next terraform apply.
Ideally you'd create a single DynamoDB module that creates the table as well as the alarms at the same time, but the DynamoDB table resource isn't particularly flexible, so it would be a nightmare to design a module that allows for the flexibility you need (mostly around differing attribute and index definitions).
If you wanted, you could combine this with looping to reduce some of the module code, in exchange for issues when the looped list changes: Terraform will see the looped resources shift and force recreation of resources that shouldn't be affected. When you are only creating alarms (nothing stateful or that needs 100% uptime) this isn't a huge issue, but be aware that it could require a second terraform apply to fully apply changes made that way.
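To use it, you would instantiate the module once per table; a minimal sketch (the module path and threshold values here are assumptions):

module "orders_table_alarms" {
  source = "./modules/dynamodb_alarms" # hypothetical local path to the module above

  dynamodb_table_name            = "orders"
  consumed_read_units_threshold  = "240"
  consumed_write_units_threshold = "240"
}

Fifty tables then need fifty short module blocks instead of 500 full resource blocks, and any change to an alarm definition is made once, inside the module.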

Related

How do I get a value from a dictionary when the key is a value in another dictionary in Lua?

I am writing some code where I have multiple dictionaries for my data. The reason being, I have multiple core objects and multiple smaller assets, and the user must be able to choose a smaller asset and have some function elsewhere run the code with the parent noted.
An example of one of the dictionaries: (I'm working in ROBLOX Lua 5.1 but the syntax for the problem should be identical)
local data = {
    character = workspace.Stores.NPCs.Thom,
    name = "Thom",
    npcId = 9,
    npcDialog = workspace.Stores.NPCs.Thom.Dialog
}

local items = {
    item1 = {
        model = workspace.Stores.Items.Item1.Main,
        npcName = "Thom",
    }
}
This is my function:
local function function1(item)
    if not items[item] and data[items[item[npcName]]] then return false end
end
As you can see, I try to index the dictionary using a key from another dictionary. Usually this is no problem.
local thisIsAVariable = item[item1[npcName]]
but the method I use above tries to index the data dictionary for data that is in the items dictionary.
Without a ton of local variables and clutter, is there a way to do this? I had an idea to wrap the conflicting dictionary reference in a tostring() function to separate them - would that work?
Thank you.
As I see it, your issue is that:
data[items[item[npcName]]]
is looking for data["Thom"], but you do not have such a key in the data table. You have a "name" key whose value is "Thom". You could invert the data table so that the name becomes the key, i.e. ["Thom"] = { ... }.
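A minimal sketch of that inversion, reusing the names from the question (having function1 return the NPC data is my assumption, since the original only returns false):

-- Build a reverse lookup keyed by NPC name, so names found in `items`
-- can be used to index NPC data directly.
local dataByName = {}
dataByName[data.name] = data          -- dataByName["Thom"] -> Thom's data table

local function function1(itemKey)
    local item = items[itemKey]
    if not item then return false end     -- unknown item
    local npcData = dataByName[item.npcName]
    if not npcData then return false end  -- item references an unknown NPC
    return npcData
end

print(function1("item1").npcId)  -- 9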

How to use recursive function in Lua to dynamically build a table without overwriting it each call

I have a need to crawl a set of data of indeterminate size and build a table key/value index of it. Since I don't know the dimensions in advance, it seems I have to use a recursive function. My Lua skills are very new and superficial, and I'm having difficulty understanding how to deal with returning a table from the function call.
Note this is for a Lua 5.1 script processor.
API = function(tbl)
    local table_api = {}
    -- do stuff here with target data and add to table_api
    table_api["key"] = value
    -- then later there is a need to recurse deeper into target data
    table_api["lower"] = API(var)
    return table_api
end

result = API(source_data)
In most other languages I know there would be some way to make the recursion line table_api["lower"] = API(var) work, but since Lua handles table variables by reference, my returned sub-table just keeps getting overwritten, and my result is only a tiny last piece of what it should be.
Just for background on my purpose: there's a commercial application I'm working with that has a weakly documented Lua scripting interface. It's running Lua 5.1, and the API is not well documented and frequently updated. As I understand it, everything is stored in _G, so I wanted to write something to reverse engineer the API. I have a working recursive function (not shown here) that enumerates all of _G. For its return value, it just builds up an annotated string, progressively appending to the string. That all works fine and is really useful, but it shows much more than the API interface; all the actual data elements are included, so I have to sift through some 30,000 records to determine an API set of about 500 terms. In order to determine the API, I am trying to use the sub-table-returning recursive function being discussed in this question. The code I'm showing here is just a small distilled subset of the larger function.
I'll go ahead and include the full code here. I was hoping to incrementally build a large'ish table of each API level, any sublevels, and finally whatever keys used at the lowest level.
In the end, I was expecting to have a table that I could address like this:
result["api"]["label"]["api"]["sublabel"]["value"]["valuename"]
Full code:
tableAPIShow = function(tbl, table_track)
    table_track = table_track or {}
    local table_api = {}
    if type(tbl) == 'table' then
        -- Check if values are tables.
        local parent_table_flag = true
        for ind, val in pairs(tbl) do
            if type(val) ~= 'table' then
                parent_table_flag = false
                break
            end
        end
        -- If all children are table type, check each of them for subordinate commonality.
        local api_flag = false
        if parent_table_flag == true then
            local child_table = {}
            local child_table_flag = false
            api_flag = true
            for ind, val in pairs(tbl) do
                -- For each child table, store the names of the indexes.
                for sub_ind, sub_val in pairs(val) do
                    if child_table_flag == false then -- First time through, create starting template view of typical child table.
                        child_table[sub_ind] = true -- Store the indexes as a template table.
                    elseif child_table[sub_ind] == nil then -- Otherwise, test this child table against the reference template.
                        api_flag = false
                        break
                    end
                end
                if api_flag == false then -- need to break out of nested loop
                    break
                end
                child_table_flag = true
            end
        end
        if api_flag == true then
            -- If everything gets to here, then this level is an API with matching child tables below.
            for ind, val in pairs(tbl) do
                if table_api["api"] == nil then
                    table_api["api"] = {}
                end
                table_api["api"][ind] = tableAPIShow(val, table_track)
            end
        else
            -- This level is not an API level; determine how to process otherwise.
            for ind, val in pairs(tbl) do
                if type(val) == 'table' then
                    if table_track[val] ~= nil then -- Have we already recursed this table?
                    else
                        table_track[val] = true
                        if table_api["table"] == nil then
                            table_api["table"] = {}
                        end
                        table_api["table"][ind] = tableAPIShow(val, table_track)
                    end
                else -- The children are not tables, they are values.
                    if table_api["value"] == nil then
                        table_api["value"] = {}
                    end
                    table_api["value"][ind] = val
                end
            end
        end
    else
        -- It's not a table, just return it.
        -- Probably never used, because non-tables are caught at the level above and this is not called on them.
        return tbl
    end
    return table_api
end
And I was calling this function in the main script like this:
local str = tableAPIShow(_G)
I've got another function that recursively shows a table so I can look inside my results, and I see I get a return value that contains only the values of the top level of _G (I have excluded built-in Lua functions/values because I'm only interested in the application API):
{
    [value] = table: 00000000F22CB700 {
        [value][_VERSION] = Application/5.8.1 (x86_64; Windows NT 10.0.16299),
        [value][tableAPIShow] = "function: 00000000F22C6DE0, defined in (121-231) C:\\Users\\user\\AppData\\Local\\Temp\\APP\\/~mis00002690 ",
        [value][_FINAL_VERSION] = true,
        [value][Path] = ./Scripts/Database/elements/,
        [value][class] = "function: 00000000F1953C40, defined in (68-81) Scripts/Common/Class.lua ",
        [value][db_path] = ./Scripts/Database/,
        [value][merge_all_units] = "function: 00000000F20D20C8, defined in (2242-2250) Scripts/Database/db_merge.lua ",
}
You just need to localize the variable you store your table in and it will work as you expect:
local table_api = {}
(Note that you are passing a table variable that conflicts with the global table variable and is not currently used in the function.)
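To illustrate why, here is a minimal sketch, assuming the accumulator was global in the earlier revision of the question:

-- With a global accumulator, every recursion level reassigns the same name:
buggyAPI = function(tbl)
    table_api = {}                             -- global: shared across all recursive calls
    table_api.key = tbl.key
    if tbl.lower then
        table_api.lower = buggyAPI(tbl.lower)  -- inner call reassigns the global table_api
    end
    return table_api                           -- now refers to the innermost table,
end                                            -- so the outer levels are lost

-- With `local table_api = {}` each call owns its own table and nesting is preserved.
print(buggyAPI({ key = "a", lower = { key = "b" } }).key)  -- prints "b", not "a"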
I am inclined to believe that your tableAPIShow function code is working correctly and that the
"another function that recursively shows a table"
fails to serialize your table fully. Hence you don't see the deeper levels of the table returned by tableAPIShow().
I got your initial and current code (tableAPIShow) to work with my simple table serialize function: the whole _G table is exported completely and formatted as you implemented it in your tableAPIShow().
Code for testing:
apiMassiveTable = {
    api = {
        get = {
            profile = {"profileID", "format"},
            name = {"profileID", "encoding"},
            number = {"profileID", "binary"}
        },
        set = {
            name = {"apikey", "profileID", "encoding", "newname"},
            number = {"apikey", "profileID", "binary", "newnumber"}
        },
        retrieve = {}
    },
    metadata = {version = "1.4.2", build = "nightly"}
}

-- tableAPIShow implementation here
table.serialize = dofile("serialize.lua")
print(table.serialize("myNameForAHugeTable", tableAPIShow(_G)))
PS: Whatever serialize function you're using, it should enquote strings like Application/5.8.1 (x86_64; Windows NT 10.0.16299), which it currently does not.
Like @Paul Kulchenko said, you need to learn to use locals (https://www.lua.org/pil/4.2.html). Global variables persist until a new lua_State is loaded (a new environment, which could be a new process depending on what interpreter you are using). So a tip is to always use local variables for anything you don't want to leave the function or the compilation unit.
Think of tables like dictionaries: a word is attached to a definition, and the definition is the data.
What I think you are trying to do is serialize a table of data. However, that isn't really necessary. You can either shallow copy or deep copy the given table. A shallow copy is when you don't delve into the depths of tables found in the keys, etc. A deep copy is when you copy the tables in the keys of tables in the keys of tables... etc.
local shallow_copy = function(tab)
    local rep_tab = {}
    for index, value in pairs(tab) do
        rep_tab[index] = value
    end
    return rep_tab
end

-- Because a local is not in scope inside its own declaration statement,
-- declare deep_copy first so the recursive call below can see it.
local deep_copy
deep_copy = function(tab)
    local rep_tab = {}
    for index, value in pairs(tab) do
        if type(value) == "table" then
            rep_tab[index] = deep_copy(value) -- recurse into nested tables
        else
            rep_tab[index] = value
        end
    end
    return rep_tab
end
Deco's deepcopy.lua
https://gist.github.com/Deco/3985043
You can also index tables using periods:
local tab = {}
tab.abc = 123
tab["def"] = 456
print(tab.abc, tab["def"])
To serialize the entire _G, you would just filter out the junk you don't need and recurse into each table encountered. Watch out for _G, package, and _ENV because, if defined, they will recurse back to the start.
-- use cap as a whitelist of type names
enumerate_dir = function(dir, cap)
    local base = {}
    for i, v in pairs(dir) do
        -- skip trouble
        if i ~= "_G" and i ~= "_ENV" and i ~= "package" then
            if type(v) == "table" then -- if we have a table, recurse
                base[i] = enumerate_dir(v, cap)
            else
                for k, n in pairs(cap) do
                    if type(v) == n then -- if whitelisted
                        base[i] = tostring(v)
                    end
                end
            end
        end
    end
    return base
end

-- only functions and tables from _G please
local enumeration = enumerate_dir(_G, {"function", "table"})
-- do some cool quips to get a basic tree view
prefix = ""
recursive_print = function(dir)
    for i, v in pairs(dir) do
        if type(v) == "table" then
            print(prefix, i, v)
            prefix = prefix .. ">" -- deepen the tree marker before recursing
            recursive_print(v)
        else
            print(prefix, i, v)
        end
    end
    if #prefix > 0 then
        prefix = prefix:sub(1, #prefix - 1) -- restore the caller's prefix
    end
end

recursive_print(enumeration)

Losing some z3c relation data on restart

I have the following code which is meant to programmatically assign relation values to a custom content type.
# (imports assumed for a typical Plone/dexterity setup)
from Products.CMFCore.utils import getToolByName
from z3c.relationfield import RelationValue
from zope.component import getUtility
from zope.event import notify
from zope.intid.interfaces import IIntIds
from zope.lifecycleevent import ObjectModifiedEvent

publications = # some data
intids = getUtility(IIntIds)
catalog = getToolByName(context, 'portal_catalog')
for pub in publications:
    if pub['custom_id']:
        results = catalog(custom_id=pub['custom_id'])
        if len(results) == 1:
            obj = results[0].getObject()
            measures = []
            for m in pub['measure']:
                if m in context.objectIds():
                    m_id = intids.getId(context[m])
                    relation = RelationValue(m_id)
                    measures.append(relation)
            obj.measures = measures
            obj.reindexObject()
            notify(ObjectModifiedEvent(obj))
Snippet of schema for custom content type
measures = RelationList(
    title=_(u'Measure(s)'),
    required=False,
    value_type=RelationChoice(
        title=_(u'Measure'),
        source=ObjPathSourceBinder(
            object_provides='foo.bar.interfaces.measure.IMeasure')),
)
When I run my script everything looks good. The problem is that when my template for the custom content type tries to call "pub/from_object/absolute_url", the value is blank, but only after a restart. Interestingly, I can get other attributes of pub/from_object after a restart, just not its URL.
from_object retrieves the referencing object from the relation catalog, but doesn't put the object back in its proper Acquisition chain. See http://docs.plone.org/external/plone.app.dexterity/docs/advanced/references.html#back-references for a way to do it that should work.
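For reference, the pattern described there looks roughly like this sketch (assuming zc.relation and the intid utility are available; the helper name follows the docs, not the asker's code):

from zc.relation.interfaces import ICatalog
from zope.component import getUtility
from zope.intid.interfaces import IIntIds

def back_references(source_object, attribute_name):
    """Objects whose relation named attribute_name points at source_object."""
    catalog = getUtility(ICatalog)
    intids = getUtility(IIntIds)
    result = []
    for rel in catalog.findRelations(
            dict(to_id=intids.getId(source_object),
                 from_attribute=attribute_name)):
        # Resolve via the intid utility instead of rel.from_object so the
        # returned object carries a usable Acquisition chain.
        obj = intids.queryObject(rel.from_id)
        if obj is not None:
            result.append(obj)
    return result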

What's wrong with my filter query to figure out if a key is a member of a list(db.key) property?

I'm having trouble retrieving a filtered list from google app engine datastore (using python for server side). My data entity is defined as the following
class Course_Table(db.Model):
    course_name = db.StringProperty(required=True, indexed=True)
    ....
    head_tags_1 = db.ListProperty(db.Key)
So the head_tags_1 property is a list of keys (which are the keys to a different entity called Headings_1).
I'm in the handler below, trying to spin through my Course_Table entity to filter the courses that have a particular Headings_1 key as a member of the head_tags_1 property. However, it doesn't seem to retrieve anything, even though I know there is data there to fulfill the request, since it never prints the log lines below when I iterate through the results of my query. Any ideas of what I'm doing wrong?
def get(self, level_num, h_key):
    path = []
    if level_num == "1":
        q = Course_Table.all().filter("head_tags_1 =", h_key)
        for each in q:
            logging.info('going through courses with this heading name')
            logging.info("course name filtered is %s", each.course_name)
MANY MANY THANK YOUS
I assume h_key is the key of a Headings_1 entity. Since head_tags_1 is a list, I believe what you need is the IN operator: https://developers.google.com/appengine/docs/python/datastore/queries
Note: your indentation inside the for loop does not seem correct.
My bad: apparently '=' on a list property already checks membership. Using = to check membership is working for me; can you make sure h_key is really a datastore Key instance?
Here is my example; the first get produces a result, whereas the second one does not:
import webapp2
from google.appengine.ext import db

class Greeting(db.Model):
    author = db.StringProperty()
    x = db.ListProperty(db.Key)

class C(db.Model):
    name = db.StringProperty()

class MainPage(webapp2.RequestHandler):
    def get(self):
        ckey = db.Key.from_path('C', 'abc')
        dkey = db.Key.from_path('C', 'def')
        ekey = db.Key.from_path('C', 'ghi')
        Greeting(author='xxx', x=[ckey, dkey]).put()
        x = Greeting.all().filter('x =', ckey).get()
        self.response.write(x and x.author or 'None')
        x = Greeting.all().filter('x =', ekey).get()
        self.response.write(x and x.author or 'None')

app = webapp2.WSGIApplication([('/', MainPage)],
                              debug=True)
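That last point is likely the fix for the original handler: h_key arrives from the URL as a plain string, so it would need to be turned back into a real key before filtering. A one-line sketch, assuming the string is a URL-safe encoded key:

q = Course_Table.all().filter("head_tags_1 =", db.Key(h_key))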

sort collection items by getObjPositionInParent

I would like to have getObjPositionInParent as a sort criteria in a collection. I configured it as "available" in the site setup for collection views. But it is not available. Did I forget something?
You didn't forget anything; you found a bug in Plone. The GopipIndex from plone.app.folder is used for the getObjPositionInParent index, but this index type is not registered for any collection criteria. The criterion registry in Products.ATContentTypes.criteria needs to include a mapping for the GopipIndex; adding it to the SORT_INDICES list would likely be the right fix. To do this from the outside, you can do something like:
# Make sort criteria available for the GopipIndex
from Products.ATContentTypes.criteria import _criterionRegistry
crit_reg = _criterionRegistry
crit_id = 'ATSortCriterion'
index = 'GopipIndex'
indices = crit_reg.criterion2index.get(crit_id, ())
crit_reg.criterion2index[crit_id] = indices + (index, )
value = crit_reg.index2criterion.get(index, ())
crit_reg.index2criterion[index] = value + (crit_id, )
