Looker Unit Testing: value in development against a value in production - looker

I would like to write a unit test that checks whether a value in the development environment matches the value in production in Looker. At the moment I can check values against a static hardcoded value, but I am not able to check dynamically:
test: 5_test_impacted_engineers_with_new_eng_levels {
  explore_source: fellow {
    column: impacted_engineers {
      field: fellow.count
    }
    filters: {
      field: fellow.eng_level
      value: "Eng1, Eng2, Eng3, Eng4, Eng5, Eng6, Eng7"
    }
    filters: {
      field: fellow_promotion_dates.impacted_engineers
      value: "yes"
    }
  }
  ## impacted engineers should not have had eng levels, yes?
  assert: expected_impacted_new_levels {
    expression: ${fellow.count} = 0 ;;
  }
}

Related

Using regex to negate a filter in wiremock jsonpath

I am using WireMock for stubbing, and it uses Jayway JsonPath.
I want to create a stub only when the JSON element doesn't contain exactly 10 digits.
The stub is:
"request": {
  "bodyPatterns": [
    {
      "matchingJsonPath": "$.employees[?(@.employeeContact =~/[^0-9]{10}/)]"
    }
  ]
}
I have tried multiple combinations like:
1. $.employees[?(@.employeeContact =~/[^0-9]{10}/)]
2. $.employees[?(@.employeeContact =~/^[0-9]{10}/)]
3. $.employees[?(@.employeeContact !=~/[0-9]{10}/)]
4. $.employees[?(@.employeeContact <>~/[^0-9]{10}/)]
But none of these have worked.
Example JSON which should NOT work:
{
  "employee": {
    "employeeContact": "1234567890"
  }
}
while these employees should work (anything other than 10 digits):
1. "employeeContact": "1a34567890" // character in between
2. "employeeContact": "12345678901" // more than 10
3. "employeeContact": "123456789" // less than 10
4. "employeeContact": "123456 89" //space
You could use the logical or operator to match for lengths less than 10 and greater than 10.
"bodyPatterns": [
"or": [
{ "matchingJsonPath": "$.employees[?(#.employeeContact =~/[^0-9]{1,9}/)]" },
{ "matchingJsonPath": "$.employees[?(#.employeeContact =~/[^0-9]{11,}/)]" }
]
]
This is what worked for me:
"bodyPatterns": [{
"matchesJsonPath": "$.employees[?(#.employeeContact =~/[^0-9]{1,9}/ || $.employees[?(#.employeeContact =~/[^0-9]{11,}/)]"
}]
Watch that it is matchesJsonPath instead of matchingJsonPath.
Even with that "or" didnt work with my wiremock 2.28.1 so may well be a wiremock bug.
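Putting that together, a complete stub mapping using the combined expression might look roughly like the sketch below (the method, URL, and response are placeholder assumptions; @ is Jayway JsonPath's marker for the current element):
{
  "request": {
    "method": "POST",
    "urlPath": "/employees",
    "bodyPatterns": [
      {
        "matchesJsonPath": "$.employees[?(@.employeeContact =~/[^0-9]{1,9}/ || @.employeeContact =~/[^0-9]{11,}/)]"
      }
    ]
  },
  "response": {
    "status": 200
  }
}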

Pact exact match of a field within an array

I am writing a Pact test for a request that filters out the response data by a certain field, hence I would like to create a matcher that would match an array of objects with an exact match on that field.
I tried the following two approaches:
body: eachLike({
color: 'red',
name: like('any'),
}),
body: eachLike({
color: extractPayload('red'),
name: like('any'),
}),
But both of them produce the same result:
"matchingRules": {
"$.body": {
"min": 1
},
"$.body[*].*": {
"match": "type"
},
"$.body[*].name": {
"match": "type"
}
}
It seems to me that having "$.body[*].*": {"match": "type"} in there negates the exact matching for the color field. Am I wrong in that assumption, or is there a correct approach that would resolve this issue?
Yes, the issue is that the type matching is cascading and is not being reset.
The equal matcher (V3 only) will reset the matching rule for this context.
It's available in the latest beta: https://github.com/pact-foundation/pact-js/tree/feat/v3.0.0#using-the-v3-matching-rules
To make it work in v2, I would use a regex matcher that matches only that single string value here.
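As a rough sketch of that v2 approach in pact-js (the term matcher usage and the '^red$' regex are illustrative assumptions), keeping the same fragment shape as above:
const { eachLike, like, term } = require('@pact-foundation/pact').Matchers;

// term generates 'red' and only accepts values matching ^red$,
// so color is no longer covered by the cascading type matcher
body: eachLike({
  color: term({ generate: 'red', matcher: '^red$' }),
  name: like('any'),
}),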

on conflict mutation gives unexpected result

on_conflict returns an "unknown argument" error.
I'm new to Hasura; I've looked at multiple how-tos on on_conflict, run the mutation from the API explorer and from the frontend, and tried upsert_users (which suggested changing it to insert).
mutation upsert_users {
  insert_users(
    objects: [
      {
        auth0_id: "iexistindb",
        name: "somename"
      }
    ],
    on_conflict: {
      constraint: users_pkey,
      update_columns: [last_seen, name]
    }
  ) {
    affected_rows
  }
}
I expected it to update the users table if the auth0_id already exists.
So I just encountered this now: I had the on_conflict / update_columns, but hadn't given update permissions to the role, only insert.
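For reference, a minimal sketch of what such an update permission could look like in the table's metadata YAML (the role name, columns, and filter are assumptions):
- table:
    schema: public
    name: users
  update_permissions:
    - role: user
      permission:
        columns:
          - name
          - last_seen
        filter:
          auth0_id:
            _eq: X-Hasura-User-Id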

AppSync BatchDeleteItem does not execute properly

I'm working on a React Native application with AppSync, and the following is my schema for the problem:
type JoineeDeletedConnection {
  items: [Joinee]
  nextToken: String
}
type Mutation {
  deleteJoinee(ids: [ID!]): [Joinee]
}
In the request mapping template for the deleteJoinee resolver, I have the following (following the tutorial from https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-dynamodb-batch.html):
#set($ids = [])
#foreach($id in ${ctx.args.ids})
  #set($map = {})
  $util.qr($map.put("id", $util.dynamodb.toString($id)))
  $util.qr($ids.add($map))
#end
{
  "version" : "2018-05-29",
  "operation" : "BatchDeleteItem",
  "tables" : {
    "JoineesTable": $util.toJson($ids)
  }
}
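For two ids, that request mapping template evaluates to roughly the following ($util.dynamodb.toString wraps each id in DynamoDB's typed string format):
{
  "version" : "2018-05-29",
  "operation" : "BatchDeleteItem",
  "tables" : {
    "JoineesTable": [
      { "id" : { "S" : "xxxx" } },
      { "id" : { "S" : "xxxx" } }
    ]
  }
}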
...and in the response mapping template for the resolver:
$util.toJson($ctx.result.data.JoineesTable)
The problem is, when I run the mutation, I get an empty result and nothing is deleted from the database either:
// calling the mutation
mutation DeleteJoinee {
  deleteJoinee(ids: ["xxxx", "xxxx"]) {
    id
  }
}
// returns
{
  "data": {
    "deleteJoinee": [
      null
    ]
  }
}
I was finally able to solve this puzzle, thanks to the answer mentioned here, which pointed me in the right direction.
I noticed that JoineesTable did have a trusted entity/role in the IAM 'Roles' section, yet it wasn't working for some reason. Looking into this more, I noticed that the existing policy had the following actions by default:
"Action": [
"dynamodb:DeleteItem",
"dynamodb:GetItem",
"dynamodb:PutItem",
"dynamodb:Query",
"dynamodb:Scan",
"dynamodb:UpdateItem"
]
Once I added the following two more actions to the list, things started working:
"dynamodb:BatchWriteItem",
"dynamodb:BatchGetItem"
Thanks to @Vasileios Lekakis and @Ionut Trestian on this AppSync quest :)

Storing User Data Flattened Security

I'd like to create an app that has a lot of user data. Let's say that each user tracks their own time per task. If I were to store it flattened it would look like this:
{
  users: {
    USER_ID_1: {
      name: 'Mat',
      tasks: {
        TASK_ID_1: true,
        TASK_ID_2: true,
        ...
      }
    },
  },
  tasks: {
    TASK_ID_1: {
      start: 0,
      end: 1
    },
    TASK_ID_2: {
      start: 1,
      end: 2
    }
  }
}
Now I'd like to query and get all the task information for the user. Right now the data is small. Their guide (https://www.firebase.com/docs/web/guide/structuring-data.html) says, near the end, "... Until we get into tens of thousands of records..." and then doesn't explain how to handle that.
So my question is as follows. I know we can't do filtering via security rules, but can I use security to limit what people have access to and then, when searching, base it off the user ID? My structure would then turn into this:
{
  users: {
    USER_ID_1: {
      name: 'Mat'
    }
  },
  tasks: {
    TASK_ID_1: {
      user: USER_ID_1,
      start: 0,
      end: 1
    },
    TASK_ID_2: {
      user: USER_ID_1,
      start: 1,
      end: 2
    },
    ...
  }
}
Then I would set up my security rules to only allow each task to be accessed by the user who created it, and my ref query would look like this:
var ref = new Firebase("https://MY_FIREBASE.firebaseio.com/");
$scope.tasks = $firebaseArray(ref.child('tasks/')
  .orderByChild('user')
  .startAt('USER_ID_1')
  .endAt('USER_ID_1'));
Is that how I should structure it? My query works, but I'm unsure if it'll work once I introduce security where one user can't see another user's tasks.
You've already read that security rules can not be used to filter data. Not even creative data modeling can change that. :-)
To properly secure access to your tasks you'll need something like:
"tasks": {
"$taskid": {
".read": "auth.uid === data.child(user).val()"
}
}
With this each user can only read their own tasks.
But with these rules, your query won't work. At its core, your query is reading from tasks here:
ref.child('tasks/')...some-filtering...on(...
And since your user does not have read permission on tasks this read operation fails.
If you'd give the user read permission on tasks the read and query would work, but the user could then also read all tasks that you don't want to give them access to.
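To illustrate that trade-off, a rule set that would let the query run, but at the cost of exposing every task to any signed-in user (the auth != null condition is an assumption), would look like:
"tasks": {
  ".read": "auth != null",
  "$taskid": {
    ".read": "auth.uid === data.child('user').val()"
  }
}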
