I am looping over an array of S3 buckets in a JSON file.
[
  [
    {
      "response": {
        "Buckets": [
          {"Name": "foo"},
          {"Name": "bar"}
        ]
      }
    }
  ]
]
I want to loop over each bucket, call the AWS S3 API to get the region for each bucket, append a {"region": "region_name"} key to each object inside the Buckets array, and persist the changes to the file.
I am struggling to write the modified data back to the file in a way that does not lose the rest of the data.
The script below writes to a temp.json file, but it keeps overwriting the data on each iteration. In the end I only see the last element of the Buckets array written to the temp.json file.
I want only the region key added to each element, while keeping the rest of the file's contents the same.
jq -r '.[][0].response.Buckets[].Name' $REPORT/s3.json |
while IFS= read -r bucket
do
region=$(aws s3api get-bucket-location --bucket $bucket | jq -r '.LocationConstraint')
jq -r --arg bucket "$bucket" --arg region "$region" '.[][0].response.Buckets[] | select(.Name==$bucket) | .region=$region' $REPORT/s3.json | sponge $REPORT/temp.json
done
Keep the context by adding parentheses to the LHS as in (.[][0].response.Buckets[] | select(.Name==$bucket)).region=$region.
jq -r '.[][0].response.Buckets[].Name' $REPORT/s3.json |
while IFS= read -r bucket
do
region=$(aws s3api get-bucket-location --bucket $bucket | jq -r '.LocationConstraint')
jq -r --arg bucket "$bucket" --arg region "$region" '(.[][0].response.Buckets[] | select(.Name==$bucket)).region=$region' $REPORT/s3.json | sponge $REPORT/temp.json
done
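For reference, sponge comes from the moreutils package: it soaks up all of its standard input before opening the output file, which is what makes it safe to have the same file on both ends of a pipeline. Without it, a plain shell redirect would truncate the output file before jq had a chance to read it. A minimal, hypothetical in-place edit would look like:
jq '.count += 1' data.json | sponge data.json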
With a little more refactoring, you could also provide the bucket name directly to the first inner jq call, which can already create the final objects; these can then be fed as an array to the outer jq call using --slurpfile, for instance. This moves the second inner jq call outside the loop, reducing the number of jq calls per bucket from two to one.
jq --slurpfile buckets <(
jq -r '.[][0].response.Buckets[].Name' $REPORT/s3.json |
while IFS= read -r bucket; do
aws s3api get-bucket-location --bucket $bucket | jq --arg bucket "$bucket" '{Name: $bucket, region: .LocationConstraint}'
done
) '.[][0].response.Buckets = $buckets' $REPORT/s3.json | sponge $REPORT/temp.json
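For illustration only (the region values below are made up, and note that get-bucket-location reports a LocationConstraint of null for buckets in us-east-1), the resulting temp.json would then look something like this:
[
  [
    {
      "response": {
        "Buckets": [
          {"Name": "foo", "region": "eu-west-1"},
          {"Name": "bar", "region": "ap-south-1"}
        ]
      }
    }
  ]
]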
Related
{
"a": "jdsdjhsandks"
}
How can I compute a modular hash of a field using a jq expression?
jq does not implement hash functions; you have to export the data, apply an external tool, and re-import the hash.
For instance, if your JSON lived in a file called input.json and you were using bash to call jq, you could do:
# Export the data
data="$(jq -r '.a' input.json)"
# Apply an external tool
md5sum="$(printf '%.32s' "$(md5sum <<< "${data}")")"
# Re-import the hash
jq --arg md5sum "${md5sum}" '.a_md5 = $md5sum' input.json
or without using variables
jq --arg md5sum "$(
printf '%.32s' "$(
md5sum <<< "$(
jq -r '.a' input.json
)"
)"
)" '.a_md5 = $md5sum' input.json
How do I filter using jq for entries which contain the string "edp-api-dev"?
{
"serviceArns": [
"arn:aws:ecs:us-east-1:1234:service/splat-dev/abc-api-dev-ecs-abc-api-man-1920299",
"arn:aws:ecs:us-east-1:1234:service/edp-api-dev-ecs-edp-api-man-721g8a7d",
"arn:aws:ecs:us-east-1:1234:service/tsm-frontend-dev-ecs-tsm-frontend-man",
"arn:aws:ecs:us-east-1:1234:service/doc-svc-dev-ecs-doc-svc-man",
"arn:aws:ecs:us-east-1:1234:service/wwk-frontend-dev-ecs-wwk-frontend-man-8fea6a0b",
"arn:aws:ecs:us-east-1:1234:service/xyaz-fsse-ecs-xyaz-fsse-man"
]
}
I tried
aws ecs list-services --cluster splat-dev --profile mfa | jq -r '.serviceArns[] | select( . | contains("edp-api-dev")'
but I get a syntax error.
You forgot the closing parenthesis:
aws ecs list-services --cluster splat-dev --profile mfa |
jq -r '.serviceArns[] | select(contains("edp-api-dev"))'
# ^
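Note that contains does plain substring matching here. If you need a pattern rather than a fixed string, test accepts a regular expression and slots into the same select:
aws ecs list-services --cluster splat-dev --profile mfa |
  jq -r '.serviceArns[] | select(test("edp-api-dev"))'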
I have limited experience with jq and am having an issue doing a select contains for a string in a boolean. This is my JSON, and I am looking to get back just tdonn.
[
"user",
"admin"
]
[
[
"tdonn",
true
]
]
Here is what I'm trying. I have tried many different ways too.
jq -e -r '.results[] | .series[] | select(.values[] | contains("tdon"))[]'
With the sample JSON shown in a comment, the following filter would produce the result shown:
.results[] | .series[][] | flatten[] | select(contains("tdon")?)
Output with the -r option:
tdonn
You might like to consider:
jq '.. | strings | select(contains("tdon"))'
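For instance, applied with -r to just the second fragment shown above (piped in via echo purely for illustration), it prints the single matching string:
echo '[["tdonn", true]]' | jq -r '.. | strings | select(contains("tdon"))'
# tdonn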
I'm trying to filter an AWS ECR image list returned as JSON with jq and regular expressions.
The following command works as expected and returns the filtered list:
aws ecr list-images --registry-id 123456789012 --repository-name repo | jq '.imageIds | map(select(.imageTag)) | map(select(.imageTag | test("[a-z0-9]-[0-9]")))'
[
{
"imageTag": "bbe3d9-2",
"imageDigest": "sha256:4c0e92098010fd26e07962eb6e9c7d23315bd8f53885a0651d06c2e2e051317d"
},
{
"imageTag": "3c840a-1",
"imageDigest": "sha256:9d05e04ccd18f2795121118bf0407b0636b9567c022908550c54e3c77534f8c1"
},
{
"imageTag": "1c0d05-141",
"imageDigest": "sha256:a62faabb9199bfc449f0e0a6d3cdc9be57b688a0890f43684d6d89abcf909ada"
}
]
But when I try to pass the regular expression as an argument to jq, it returns an empty array.
aws ecr list-images --registry-id 123456789012 --repository-name repo | jq --arg reg_exp "[a-z0-9]-[0-9]" '.imageIds | map(select(.imageTag)) | map(select(.imageTag | test("$reg_exp")))'
[]
I have tried multiple ways to pass that variable, but just can't get it to work. Other possibly relevant information: I'm using zsh on a Mac and my jq version is jq-1.5. Any help is appreciated.
$reg_exp is a variable referring to your regular expression; "$reg_exp" is just a literal string. Remove the quotes. (Also, that extra map/select is redundant.)
jq --arg reg_exp "[a-z0-9]-[0-9]" '.imageIds | map(select(.imageTag | test($reg_exp)))'
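If you ever do want to embed the variable inside a larger pattern, jq's string interpolation works within the quotes, so test("\($reg_exp)") is equivalent to test($reg_exp) here and can be extended with anchors or extra pattern text around it. Applied to the original command:
aws ecr list-images --registry-id 123456789012 --repository-name repo | jq --arg reg_exp "[a-z0-9]-[0-9]" '.imageIds | map(select(.imageTag)) | map(select(.imageTag | test("\($reg_exp)")))'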
I am trying to use jq 1.5 to develop a script that can take one or more user inputs representing keys and recursively remove them from the JSON input.
The JSON I am referencing is here:
https://github.com/EmersonElectricCo/fsf/blob/master/docs/Test.json
My script, which seems to work pretty well, is as follows.
def post_recurse(f):
def r:
(f | select(. != null) | r), .;
r;
def post_recurse:
post_recurse(.[]?);
(post_recurse | objects) |= del(.META_BASIC_INFO)
However, I would like to replace META_BASIC_INFO with one or more user inputs. How would I go about accomplishing this? I presume with --arg from the command line, but I am unclear on how to incorporate this into my .jq script.
I've tried replacing del(.META_BASIC_INFO) with del(.$module) and invoking with cat test.json | ./jq -f fsf_key_filter.jq --arg module META_BASIC_INFO to test, but this does not work.
Any guidance on this is greatly appreciated!
ANSWER:
Based on a couple of suggestions I was able to arrive at the following, which works and uses jq.
Invocation:
cat test.json | jq --argjson delete '["META_BASIC_INFO","SCAN_YARA"]' -f fsf_module_filter.jq
Code:
def post_recurse(f):
def r:
(f | select(. != null) | r), .;
r;
def post_recurse:
post_recurse(.[]?);
(post_recurse | objects) |= reduce $delete[] as $d (.; delpaths([[ $d ]]))
It seems the name module is a keyword in 1.5, so $module will result in a syntax error. You should use a different name. There are other builtins to do the recursion for you; consider using them instead of churning out your own.
$ jq '(.. | objects | select(has($a))) |= del(.[$a])' --arg a "META_BASIC_INFO" Test.json
You could also use delpaths/1. For example:
$ jq -n '{"a":1, "b": 1} | delpaths([["a"]])'
{
"b": 1
}
That is, modifying your program so that the last line reads like this:
(post_recurse | objects) |= delpaths([[ $delete ]] )
you would invoke jq like so:
$ jq --arg delete "META_BASIC_INFO" -f delete.jq input.json
(One cannot use --arg module ... as "$module" has some kind of reserved status.)
Here's a "one-line" solution using walk/1:
jq --arg d "META_BASIC_INFO" 'walk(if type == "object" then del(.[$d]) else . end)' input.json
If walk/1 is not in your jq, here is its definition:
# Apply f to composite entities recursively, and to atoms
def walk(f):
. as $in
| if type == "object" then
reduce keys[] as $key
( {}; . + { ($key): ($in[$key] | walk(f)) } ) | f
elif type == "array" then map( walk(f) ) | f
else f
end;
If you want to recursively delete a bunch of key-value pairs, then here's one approach using --argjson:
rdelete.jq:
def rdelete(key):
walk(if type == "object" then del(.[key]) else . end);
reduce $strings[] as $s (.; rdelete($s))
Invocation:
$ jq --argjson strings '["a","b"]' -f rdelete.jq input.json
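As an aside, in jq 1.6 and later walk is a builtin and positional command-line arguments are exposed via $ARGS.positional, so the same recursive delete can be done without a separate .jq file. A sketch, assuming jq 1.6+:
cat test.json | jq 'walk(if type == "object" then del(.[$ARGS.positional[]]) else . end)' --args META_BASIC_INFO SCAN_YARA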