How to compute a modular hash of a field using jq?

{
  "a": "jdsdjhsandks"
}
How can I compute a modular hash of a field using a jq expression?

jq does not implement hash functions; you have to export the data, apply an external tool, and re-import the hash.
For instance, if your JSON lived in a file called input.json and you were using bash to call jq, you could do:
# Export the data
data="$(jq -r '.a' input.json)"
# Apply an external tool (printf rather than a here-string, so no
# trailing newline is hashed); keep only the 32 hex digits
md5sum="$(printf '%.32s' "$(printf '%s' "${data}" | md5sum)")"
# Re-import the hash
jq --arg md5sum "${md5sum}" '.a_md5 = $md5sum' input.json
Or, without intermediate variables (jq -j emits raw output with no trailing newline):
jq --arg md5sum "$(
  printf '%.32s' "$(
    jq -j '.a' input.json | md5sum
  )"
)" '.a_md5 = $md5sum' input.json
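Since the question asks for a modular hash, here is a sketch on top of the above, assuming "modular hash" means the digest reduced modulo some bucket count N (the N=100 below is a made-up value; requires bash for the base-16 arithmetic and substring expansion):

```shell
# Reduce the md5 digest modulo N. bash arithmetic is 64-bit signed,
# so only the first 15 hex digits (60 bits) of the digest are used.
N=100
data="$(jq -r '.a' input.json)"
md5="$(printf '%s' "$data" | md5sum | cut -c1-32)"
mod=$(( 16#${md5:0:15} % N ))
# Re-import as a number via --argjson
jq --argjson mod "$mod" '.a_mod = $mod' input.json
```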

Related

Add key to array of objects and persist the change using jq

I am looping over array of s3 buckets in a json file.
[
  [
    {
      "response": {
        "Buckets": [
          {"Name": "foo"},
          {"Name": "bar"}
        ]
      }
    }
  ]
]
I want to loop over each bucket, call the aws s3 api to get the region for each bucket and append the {"region": "region_name"} for each object inside Buckets array and persist the changes to file.
I am struggling to write the modified data back to the file without losing the rest of its contents.
The script below writes to a temp.json file, but it overwrites the data on each iteration; in the end only the last element of the Buckets array ends up in temp.json.
I want only the region key added to each element, keeping everything else in the file the same.
jq -r '.[][0].response.Buckets[].Name' $REPORT/s3.json |
while IFS= read -r bucket
do
  region=$(aws s3api get-bucket-location --bucket "$bucket" | jq -r '.LocationConstraint')
  jq -r --arg bucket "$bucket" --arg region "$region" '.[][0].response.Buckets[] | select(.Name==$bucket) | .region=$region' $REPORT/s3.json | sponge $REPORT/temp.json
done
Keep the context by adding parentheses to the LHS as in (.[][0].response.Buckets[] | select(.Name==$bucket)).region=$region.
jq -r '.[][0].response.Buckets[].Name' $REPORT/s3.json |
while IFS= read -r bucket
do
  region=$(aws s3api get-bucket-location --bucket "$bucket" | jq -r '.LocationConstraint')
  jq -r --arg bucket "$bucket" --arg region "$region" '(.[][0].response.Buckets[] | select(.Name==$bucket)).region=$region' $REPORT/s3.json | sponge $REPORT/temp.json
done
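To see why the parentheses matter, here is a toy run with an inline copy of the input (the region value is made up):

```shell
json='[[{"response":{"Buckets":[{"Name":"foo"},{"Name":"bar"}]}}]]'

# Without parentheses: the pipeline narrows the context, so only the
# selected bucket object is emitted
jq -c '.[][0].response.Buckets[] | select(.Name=="foo") | .region="x"' <<<"$json"
# → {"Name":"foo","region":"x"}

# With parentheses: the path is evaluated as an assignment target, so
# the whole document is preserved
jq -c '(.[][0].response.Buckets[] | select(.Name=="foo")).region="x"' <<<"$json"
# → [[{"response":{"Buckets":[{"Name":"foo","region":"x"},{"Name":"bar"}]}}]]
```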
With a little more refactoring, you could provide the bucket name directly to the first inner jq call, which can already create the final objects; those are then fed as an array to the outer jq call using --slurpfile, for instance. This moves the second inner jq call outside the loop, cutting the per-bucket jq invocations in half.
jq --slurpfile buckets <(
  jq -r '.[][0].response.Buckets[].Name' $REPORT/s3.json |
  while IFS= read -r bucket; do
    aws s3api get-bucket-location --bucket "$bucket" |
    jq --arg bucket "$bucket" '{Name: $bucket, region: .LocationConstraint}'
  done
) '.[][0].response.Buckets = $buckets' $REPORT/s3.json | sponge $REPORT/temp.json

jq - how to use variable in an expression

As in Passing bash variable to jq, we should be able to use a JQ variable as $VAR in a jq expression.
projectID=$(jq -r --arg EMAILID "$EMAILID" '
  .resource[]
  | select(.username==$EMAILID)
  | .id' file.json
)
So, to extract project_id from the JSON file sample.json:
{
  "dev": {
    "gcp": {
      "project_id": "forecast-dev-1234",
      "project_number": "123456789",
      "endpoint_id": "6837352639743655936"
    }
  }
}
I ran the jq expression using a variable, but it did not work.
# TARGET=dev
$ jq -r --arg TARGET "${TARGET}" '.$TARGET.gcp.project_id' sample.json
-----
jq: error: syntax error, unexpected '
.$TARGET.gcp.project_id
(Unix shell quoting issues?) at <top-level>, line 1:
.$TARGET.gcp.project_id
jq: error: try .["field"] instead of .field for unusually named fields at <top-level>, line 1:
.$TARGET.gcp.project_id
jq: 2 compile errors
Please help me understand why, and how to use the variable in an expression to extract project_id.
The jq manual does not clearly explain variables and --arg. Is there a good resource that explains jq variables and how to use them?
--arg name value:
This option passes a value to the jq program as a predefined variable. If you run jq with --arg foo bar, then $foo is available in the program and has the value "bar". Note that value will be treated as a string, so --arg foo 123 will bind $foo to "123".
Workaround
Using interpolation.
$ TARGET=dev
$ jq -r --arg TARGET "${TARGET}" '."\($TARGET)".gcp.project_id' sample_interpolation.json
-----
forecast-dev-1234
Version
jq --version
---
jq-1.6
You're using variables fine; the problem is that the object-identifier syntax doesn't allow general expressions. It's a shorthand for when the key you're looking up is a fixed identifier-like string, such as .foo or .project_id. As noted in the manual, you can use the more general generic object index filter for arbitrary keys, including keys computed by an expression, such as .[$TARGET]:
$ TARGET=dev
$ jq -r --arg TARGET "${TARGET}" '.[$TARGET].gcp.project_id' sample.json
forecast-dev-1234
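A minimal illustration of the two working forms side by side, using an inline document of the same shape as sample.json:

```shell
TARGET=dev
json='{"dev":{"gcp":{"project_id":"forecast-dev-1234"}}}'

# generic object index
jq -r --arg TARGET "$TARGET" '.[$TARGET].gcp.project_id' <<<"$json"
# string interpolation
jq -r --arg TARGET "$TARGET" '."\($TARGET)".gcp.project_id' <<<"$json"
# both print: forecast-dev-1234
```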

How can I efficiently extract variables from JSON with xonsh?

Given a variable AWS_ASSUMED_ROLE that contains the output of aws sts assume-role (a JSON string), I can write the following in bash.
export AWS_ACCESS_KEY_ID=$( jq -r '.Credentials.AccessKeyId' <<<$AWS_ASSUMED_ROLE )
export AWS_SECRET_ACCESS_KEY=$( jq -r '.Credentials.SecretAccessKey' <<<$AWS_ASSUMED_ROLE )
export AWS_SESSION_TOKEN=$( jq -r '.Credentials.SessionToken' <<<$AWS_ASSUMED_ROLE )
aws sts get-caller-identity
However, in order to get the same functionality in xonsh, I need two echo commands.
$AWS_ACCESS_KEY_ID = $( echo -n @$( echo @(AWS_ASSUMED_ROLE) | jq -r '.Credentials.AccessKeyId') )
$AWS_SECRET_ACCESS_KEY = $( echo -n @$( echo @(AWS_ASSUMED_ROLE) | jq -r '.Credentials.SecretAccessKey' ) )
$AWS_SESSION_TOKEN = $( echo -n @$( echo @(AWS_ASSUMED_ROLE) | jq -r '.Credentials.SessionToken' ) )
aws sts get-caller-identity
The inner one provides jq with the input data; the outer one lets me set the corresponding environment variable to a string value without a trailing newline.
Okay, a little awkward but not too bad. However, is there a better way to do it?
jq is a great tool -- for this particular case with xonsh, I'd lean on the json module instead, though.
Assuming that AWS_ASSUMED_ROLE is some stringified json blob:
import json
blob = json.loads(AWS_ASSUMED_ROLE)
$AWS_ACCESS_KEY_ID = blob["Credentials"]["AccessKeyId"]
...
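As an aside, in plain bash the three jq calls from the question could also be collapsed into one by letting jq emit shell-safe export statements via its @sh filter — a sketch assuming the same .Credentials layout:

```shell
# One jq call prints three export lines; @sh single-quotes each value
# so the credentials survive word splitting when eval runs them.
eval "$(jq -r '.Credentials
  | "export AWS_ACCESS_KEY_ID=\(.AccessKeyId | @sh)",
    "export AWS_SECRET_ACCESS_KEY=\(.SecretAccessKey | @sh)",
    "export AWS_SESSION_TOKEN=\(.SessionToken | @sh)"' <<<"$AWS_ASSUMED_ROLE")"
```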

Extract fields from json using jq

I am trying to write a shell script that gets some JSON from a URL, parses it, and extracts fields.
This is what is done so far.
#!/bin/bash
token=$(http POST :3000/signin/frontm user:='{"email": "sourav@frontm.com", "password": "Hello_789"}' | jq -r '.data.id_token')
cred=$(http POST :3000/auth provider_name:frontm token:$token user:=@/tmp/user.json | jq '{ creds: .creds, userUuid: .user.userId }')
echo $cred
access=$(jq -r "'$cred'")
echo $access
So the output from echo $cred is JSON, e.g.:
{ "creds": { "accessKeyId": "ASIAJPM3RDAZXEORAQ5Q", "secretAccessK
ey": "krg5GbU6gtQV+a5pz4ChL+ECVJm+wKogjglXOqr6", "sessionToken": "Ag
oGb3JpZ2luEAYaCXVzLWVhc3QtMSKAAmhOg7fedV+sBw+8c45HL9naPjqbC0bwaBxq
mQ9Kuvnirob8KtTcsiBkJA/OfCTpYNUFaXXYfUPvbmW5UveDJd+32Cb5Ce+3lAOkkL
aZyWJgvhM1u53WNuMekhcZX7SnlCcaO4e/A9TR74qMOsVptonw5jFB5zjbEI4hFsVX
UHXtkYMYpSyG+2P2LxWRqTg4XKcg2vT+qrLtiXu3XNK70wuCe0/L4/HjjzlLvChmhe
TRs8u8ZRcJvSim/j1sLqe85Sl1qrFv/7msCaxUa3gZ3dOcfHliH64+8NHfS1tkaVkS
iM2x4wxTdZI/SafduFDvGCsltxe9p5zQD0Jb1Qe02ccqpgUIWxAAGgw3NzE5NTYwMD
EyNDciDOQZkq8t+c7WatNLHyqDBahqpQwxpGsYODIC1Db/M4+PXmuYMdYKLwjv3Df2
JeTMw2RT1h8M0IOOPvyBWetwB42HLhv5AobIMkNVSw6tpGyZC/bLMGJatptB0hVMBg
/80VnI7pTPiSjb/LG46bbwlbJevPoorCEEqMZ3MlAJ2Xt2hMmA+sHBRRvV1hlkMnS8
NW6w9xApSGrD001zdfFkmBbHw+c4vmX+TMT7Bw0bHQZ5FQSpEBOw9M5sNOIoa+G/pP
p4WoHiYfGHzaXGQe9Iac07Fy36W/WRebZapvF7TWoIpBjAV+IrQKP3ShJdBi3Oa6py
lGUQysPa3EN0AF/gDuTsdz7TDsErzzUERfQHksK495poG92YoG2/ir8yqTQtUDvshO
7U4SbFpUrozCYT6vp7++BWnpe+miIRCvjy2spqBqv2RY6lhgC6QPfS/365T+QbSTMc
R+ZNes0gX/QrEG4q1sMoxyTltL4sXS2Dz9UXywPkg78AWCOr34ii72m/67Gqe1P3KA
vBe9xF9Hem4H1WbYAqBN76ppyJyG17qK8b2/r71c8rdY+1gYcskV1vUfTQUVCGE0y2
JXKV2UMFOwoTzy6SFIGcuTeOAHiYPgTkMZ6X7hNjf56ihzBIbhSHaST8U4eNBka8j8
Y949ilJwz9QO0l1kwdb2+fQSMblHgeYvF1P8HxBSpRA28gKkkXMf73Zk27I3O2DRGb
lcXS4tKRvan4ASTi4qkdrvVwMT5mwJI4mGIJZSiMJqPxjVh5E9OicFbIOCRcbcIRDE
mj5t9EvaSbIm4ELBMuyoFjmKJmesE03uFRcHkEXkPBxhkJbQwkJeUxHll5kR1IYzvA
K2A2EiZqjkhiSJC4NRekEuM+5WowwuWw1wU=" }, "userUuid": "mugqRKHmTPxk
obBAtwTmKk" }
So basically I am stuck here: how do I parse the JSON in $cred further, to get access to, say, accessKeyId using jq?
I wonder if the variable $cred really holds a string formatted in 67 columns, but if so, tr might help to remove the newlines so jq can extract the accessKeyId:
echo "$cred" | tr -d '\n' | jq -r '.creds.accessKeyId'
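A toy reproduction of the situation, with a key split across lines by hard wrapping (the value is made up):

```shell
# Invalid JSON: a newline inside a string literal
cred='{"creds": {"accessKeyId": "ASIAJPM3
RDAZXEORAQ5Q"}}'
# Deleting the newlines makes it parseable again
echo "$cred" | tr -d '\n' | jq -r '.creds.accessKeyId'
# prints: ASIAJPM3RDAZXEORAQ5Q
```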

extracting value from a json response unix

I have a JSON response stored in a variable
{
  "StepIds": [
    "s-12AB34Cdb"
  ]
}
How do I extract a value using sed on Unix and store it in a variable?
I wouldn't use sed. You can use the jq command-line JSON parser:
ID=$(echo "$VAR" | jq -r '.StepIds[0]')
echo $ID
Output:
s-12AB34Cdb
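As a variant, a here-string feeds the variable to jq without an echo pipeline, which sidesteps word-splitting and globbing concerns entirely:

```shell
VAR='{"StepIds":["s-12AB34Cdb"]}'
ID=$(jq -r '.StepIds[0]' <<<"$VAR")
echo "$ID"
# prints: s-12AB34Cdb
```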
